Project 4: Introduction to Neural Networks: Bank Churn Prediction¶
By Kirti Kamerkar
Problem Statement¶
Context¶
Businesses like banks which provide service have to worry about problem of 'Customer Churn' i.e. customers leaving and joining another service provider. It is important to understand which aspects of the service influence a customer's decision in this regard. Management can concentrate efforts on improvement of service, keeping in mind these priorities.
Objective¶
You as a Data scientist with the bank need to build a neural network based classifier that can determine whether a customer will leave the bank or not in the next 6 months.
Data Dictionary¶
CustomerId: Unique ID which is assigned to each customer
Surname: Last name of the customer
CreditScore: It defines the credit history of the customer.
Geography: A customer’s location
Gender: It defines the Gender of the customer
Age: Age of the customer
Tenure: Number of years for which the customer has been with the bank
NumOfProducts: refers to the number of products that a customer has purchased through the bank.
Balance: Account balance
HasCrCard: It is a categorical variable which decides whether the customer has credit card or not.
EstimatedSalary: Estimated salary
isActiveMember: Is is a categorical variable which decides whether the customer is active member of the bank or not ( Active member in the sense, using bank products regularly, making transactions etc )
Exited : whether or not the customer left the bank within six month. It can take two values ** 0=No ( Customer did not leave the bank ) ** 1=Yes ( Customer left the bank )
Importing necessary libraries¶
#Installing the libraries with the specified version.
!pip install tensorflow scikit-learn matplotlib seaborn numpy pandas -q --user
# Library for data manipulation and analysis.
import pandas as pd
# Fundamental package for scientific computing.
import numpy as np
#splitting datasets into training and testing sets.
from sklearn.model_selection import train_test_split
#Imports tools for data preprocessing including label encoding, one-hot encoding, and standard scaling
from sklearn.preprocessing import LabelEncoder, OneHotEncoder,StandardScaler
#Imports a class for imputing missing values in datasets.
from sklearn.impute import SimpleImputer
#Imports the Matplotlib library for creating visualizations.
import matplotlib.pyplot as plt
# Imports the Seaborn library for statistical data visualization.
import seaborn as sns
# Time related functions.
import time
#Imports functions for evaluating the performance of machine learning models
from sklearn.metrics import confusion_matrix, f1_score,accuracy_score, recall_score, precision_score, classification_report
#Imports the tensorflow,keras and layers.
import tensorflow
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Dense, Input, Dropout,BatchNormalization
from tensorflow.keras import backend
# to suppress unnecessary warnings
import warnings
warnings.filterwarnings("ignore")
Loading the dataset¶
#Connecting google drive and colab.
from google.colab import drive
drive.mount('/content/drive')
Mounted at /content/drive
#Defining dataframe with pandas read_csv()
df=pd.read_csv("/content/drive/MyDrive/AI_LM_BusinessApplication/bank-1.csv")
#To keep original data safe taking copy of it.
data=df.copy()
Data Overview¶
Observing Dataset
# Viewing first 5 rows of dataset
data.head()
| RowNumber | CustomerId | Surname | CreditScore | Geography | Gender | Age | Tenure | Balance | NumOfProducts | HasCrCard | IsActiveMember | EstimatedSalary | Exited | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 15634602 | Hargrave | 619 | France | Female | 42 | 2 | 0.00 | 1 | 1 | 1 | 101348.88 | 1 |
| 1 | 2 | 15647311 | Hill | 608 | Spain | Female | 41 | 1 | 83807.86 | 1 | 0 | 1 | 112542.58 | 0 |
| 2 | 3 | 15619304 | Onio | 502 | France | Female | 42 | 8 | 159660.80 | 3 | 1 | 0 | 113931.57 | 1 |
| 3 | 4 | 15701354 | Boni | 699 | France | Female | 39 | 1 | 0.00 | 2 | 0 | 0 | 93826.63 | 0 |
| 4 | 5 | 15737888 | Mitchell | 850 | Spain | Female | 43 | 2 | 125510.82 | 1 | 1 | 1 | 79084.10 | 0 |
#Viewing last 5 rows of dataset
data.tail()
| RowNumber | CustomerId | Surname | CreditScore | Geography | Gender | Age | Tenure | Balance | NumOfProducts | HasCrCard | IsActiveMember | EstimatedSalary | Exited | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 9995 | 9996 | 15606229 | Obijiaku | 771 | France | Male | 39 | 5 | 0.00 | 2 | 1 | 0 | 96270.64 | 0 |
| 9996 | 9997 | 15569892 | Johnstone | 516 | France | Male | 35 | 10 | 57369.61 | 1 | 1 | 1 | 101699.77 | 0 |
| 9997 | 9998 | 15584532 | Liu | 709 | France | Female | 36 | 7 | 0.00 | 1 | 0 | 1 | 42085.58 | 1 |
| 9998 | 9999 | 15682355 | Sabbatini | 772 | Germany | Male | 42 | 3 | 75075.31 | 2 | 1 | 0 | 92888.52 | 1 |
| 9999 | 10000 | 15628319 | Walker | 792 | France | Female | 28 | 4 | 130142.79 | 1 | 1 | 0 | 38190.78 | 0 |
Checking Datatypes of dataset
#Viewing its shape.Number of columns and number of rows.
data.shape
(10000, 14)
Observations
- It has 10000 Rows and 14 Columns.
# View the information about datatypes, column names and count,etc, of dataset
data.info()
<class 'pandas.core.frame.DataFrame'> RangeIndex: 10000 entries, 0 to 9999 Data columns (total 14 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 RowNumber 10000 non-null int64 1 CustomerId 10000 non-null int64 2 Surname 10000 non-null object 3 CreditScore 10000 non-null int64 4 Geography 10000 non-null object 5 Gender 10000 non-null object 6 Age 10000 non-null int64 7 Tenure 10000 non-null int64 8 Balance 10000 non-null float64 9 NumOfProducts 10000 non-null int64 10 HasCrCard 10000 non-null int64 11 IsActiveMember 10000 non-null int64 12 EstimatedSalary 10000 non-null float64 13 Exited 10000 non-null int64 dtypes: float64(2), int64(9), object(3) memory usage: 1.1+ MB
Observations
- Dataset has 14 columns and 10000 rows.
- RowNumber and CustomerId are unique Ids which are of type intger.
- It has three coulumns names,Surname,Geography and Gender, wih type object.
- It has two columns with float,9 columns with intger and 3 columns with object data type.
- Existed is target variable.
- It has no missing values.
Checking Missing Values
# Checking for missing/null values
data.isnull().sum()
| 0 | |
|---|---|
| RowNumber | 0 |
| CustomerId | 0 |
| Surname | 0 |
| CreditScore | 0 |
| Geography | 0 |
| Gender | 0 |
| Age | 0 |
| Tenure | 0 |
| Balance | 0 |
| NumOfProducts | 0 |
| HasCrCard | 0 |
| IsActiveMember | 0 |
| EstimatedSalary | 0 |
| Exited | 0 |
Observations:
- It has no missing values.
Checking Duplicate Values
#Checking for duplicate values
data.duplicated().sum()
0
Observation
- Dataset has no missing and duplicate values.
Checking Stastics of Dataset
# Observing stastical analysis of data
data.describe(include='all').T
| count | unique | top | freq | mean | std | min | 25% | 50% | 75% | max | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| RowNumber | 10000.0 | NaN | NaN | NaN | 5000.5 | 2886.89568 | 1.0 | 2500.75 | 5000.5 | 7500.25 | 10000.0 |
| CustomerId | 10000.0 | NaN | NaN | NaN | 15690940.5694 | 71936.186123 | 15565701.0 | 15628528.25 | 15690738.0 | 15753233.75 | 15815690.0 |
| Surname | 10000 | 2932 | Smith | 32 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| CreditScore | 10000.0 | NaN | NaN | NaN | 650.5288 | 96.653299 | 350.0 | 584.0 | 652.0 | 718.0 | 850.0 |
| Geography | 10000 | 3 | France | 5014 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| Gender | 10000 | 2 | Male | 5457 | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
| Age | 10000.0 | NaN | NaN | NaN | 38.9218 | 10.487806 | 18.0 | 32.0 | 37.0 | 44.0 | 92.0 |
| Tenure | 10000.0 | NaN | NaN | NaN | 5.0128 | 2.892174 | 0.0 | 3.0 | 5.0 | 7.0 | 10.0 |
| Balance | 10000.0 | NaN | NaN | NaN | 76485.889288 | 62397.405202 | 0.0 | 0.0 | 97198.54 | 127644.24 | 250898.09 |
| NumOfProducts | 10000.0 | NaN | NaN | NaN | 1.5302 | 0.581654 | 1.0 | 1.0 | 1.0 | 2.0 | 4.0 |
| HasCrCard | 10000.0 | NaN | NaN | NaN | 0.7055 | 0.45584 | 0.0 | 0.0 | 1.0 | 1.0 | 1.0 |
| IsActiveMember | 10000.0 | NaN | NaN | NaN | 0.5151 | 0.499797 | 0.0 | 0.0 | 1.0 | 1.0 | 1.0 |
| EstimatedSalary | 10000.0 | NaN | NaN | NaN | 100090.239881 | 57510.492818 | 11.58 | 51002.11 | 100193.915 | 149388.2475 | 199992.48 |
| Exited | 10000.0 | NaN | NaN | NaN | 0.2037 | 0.402769 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
Observations
- RowNumber and CustomerId are unique identifier of customers.
- There are 2932 Surnames and "Smith" is most popular Surname.
- CreditScore is equally distributed with average value of 650.5.Min score is 350 and max is 850.There is little bit difference between average and median value which is acceptable.
- Unique values for Geography are 3.Most of the customers are from France(5014).
- Two types of Gender.Most of the customers are Male(5457).
- Age opf Customers ranges between 18 to 92 which is a relatively wide age distribution.Average age is 39yrs.Since the mean (38.92) is slightly higher than the median (37.0), the distribution might be slightly skewed to the right.
- Tenure is in between 0 to 10ys.Average tenure are 5yrs which is same as Median.There is no skewness in data.
- Balance of the customers ranges between 0 to $250898.09.The spread is wide.Average value is median values has huge difference.Median is greater than mean which shows skewness in data towards left.
- NumberOfProducts are from 1 to 4.Average value is 1.53 and median is 1 which shows skewness towars right.
- HasCrCard is categorical variable which has value 0 and 1.Maximum customers has credit card.
- IsActiveMember is categorical varible and maximum customers are active member.
- EstimatedSalary is from 11.58 to 199992.48.It is wide spread.Which shows skewness in data.
- Maximum customers are Exited.
Exploratory Data Analysis¶
Defining Functions To Plot Histogram and Boxplot Combination and to plot Stacked Bar plot for categorical data
# function to plot a boxplot and a histogram along the same scale.
def histogram_boxplot(data, feature, figsize=(12, 7), kde=False, bins=None):
"""
Boxplot and histogram combined
data: dataframe
feature: dataframe column
figsize: size of figure (default (12,7))
kde: whether to the show density curve (default False)
bins: number of bins for histogram (default None)
"""
f2, (ax_box2, ax_hist2) = plt.subplots(
nrows=2, # Number of rows of the subplot grid= 2
sharex=True, # x-axis will be shared among all subplots
gridspec_kw={"height_ratios": (0.25, 0.75)},
figsize=figsize,
) # creating the 2 subplots
sns.boxplot(
data=data, x=feature, ax=ax_box2, showmeans=True, color="violet"
) # boxplot will be created and a triangle will indicate the mean value of the column
sns.histplot(
data=data, x=feature, kde=kde, ax=ax_hist2, bins=bins, palette="winter"
) if bins else sns.histplot(
data=data, x=feature, kde=kde, ax=ax_hist2
) # For histogram
ax_hist2.axvline(
data[feature].mean(), color="green", linestyle="--"
) # Add mean to the histogram
ax_hist2.axvline(
data[feature].median(), color="black", linestyle="-"
) # Add median to the histogram
# Function to create labeled barplots
def labeled_barplot(data, feature, perc=False, n=None):
"""
Barplot with percentage at the top
data: dataframe
feature: dataframe column
perc: whether to display percentages instead of count (default is False)
n: displays the top n category levels (default is None, i.e., display all levels)
"""
total = len(data[feature]) # length of the column
count = data[feature].nunique()
if n is None:
plt.figure(figsize=(count + 1, 5))
else:
plt.figure(figsize=(n + 1, 5))
plt.xticks(rotation=90, fontsize=15)
ax = sns.countplot(
data=data,
x=feature,
palette="Paired",
order=data[feature].value_counts().index[:n].sort_values(),
)
for p in ax.patches:
if perc == True:
label = "{:.1f}%".format(
100 * p.get_height() / total
) # percentage of each class of the category
else:
label = p.get_height() # count of each level of the category
x = p.get_x() + p.get_width() / 2 # width of the plot
y = p.get_height() # height of the plot
ax.annotate(
label,
(x, y),
ha="center",
va="center",
size=12,
xytext=(0, 5),
textcoords="offset points",
) # annotate the percentage
plt.show() # show the plot
# function to plot stacked bar chart
def stacked_barplot(data, predictor, target):
"""
Print the category counts and plot a stacked bar chart
data: dataframe
predictor: independent variable
target: target variable
"""
count = data[predictor].nunique()
sorter = data[target].value_counts().index[-1]
tab1 = pd.crosstab(data[predictor], data[target], margins=True).sort_values(
by=sorter, ascending=False
)
print(tab1)
print("-" * 120)
tab = pd.crosstab(data[predictor], data[target], normalize="index").sort_values(
by=sorter, ascending=False
)
tab.plot(kind="bar", stacked=True, figsize=(count + 1, 5))
plt.legend(
loc="lower left", frameon=False,
)
plt.legend(loc="upper left", bbox_to_anchor=(1, 1))
plt.show()
### Function to plot distributions
def distribution_plot_wrt_target(data, predictor, target):
fig, axs = plt.subplots(2, 2, figsize=(12, 10))
target_uniq = data[target].unique()
axs[0, 0].set_title("Distribution of target for target=" + str(target_uniq[0]))
sns.histplot(
data=data[data[target] == target_uniq[0]],
x=predictor,
kde=True,
ax=axs[0, 0],
color="teal",
)
axs[0, 1].set_title("Distribution of target for target=" + str(target_uniq[1]))
sns.histplot(
data=data[data[target] == target_uniq[1]],
x=predictor,
kde=True,
ax=axs[0, 1],
color="orange",
)
axs[1, 0].set_title("Boxplot w.r.t target")
sns.boxplot(data=data, x=target, y=predictor, ax=axs[1, 0], palette="gist_rainbow")
axs[1, 1].set_title("Boxplot (without outliers) w.r.t target")
sns.boxplot(
data=data,
x=target,
y=predictor,
ax=axs[1, 1],
showfliers=False,
palette="gist_rainbow",
)
plt.tight_layout()
plt.show()
#Grouping Together numerical variables and categorical variables
data_num=data.select_dtypes(include='number')
data_cat=data.select_dtypes(include='object')
data_num.head()
| RowNumber | CustomerId | CreditScore | Age | Tenure | Balance | NumOfProducts | HasCrCard | IsActiveMember | EstimatedSalary | Exited | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 15634602 | 619 | 42 | 2 | 0.00 | 1 | 1 | 1 | 101348.88 | 1 |
| 1 | 2 | 15647311 | 608 | 41 | 1 | 83807.86 | 1 | 0 | 1 | 112542.58 | 0 |
| 2 | 3 | 15619304 | 502 | 42 | 8 | 159660.80 | 3 | 1 | 0 | 113931.57 | 1 |
| 3 | 4 | 15701354 | 699 | 39 | 1 | 0.00 | 2 | 0 | 0 | 93826.63 | 0 |
| 4 | 5 | 15737888 | 850 | 43 | 2 | 125510.82 | 1 | 1 | 1 | 79084.10 | 0 |
#Converting categorical numeric variables to category
data_cat['HasCrCard']=data_num['HasCrCard'].astype('category')
data_cat['IsActiveMember']=data_num['IsActiveMember'].astype('category')
data_cat.head()
| Surname | Geography | Gender | HasCrCard | IsActiveMember | |
|---|---|---|---|---|---|
| 0 | Hargrave | France | Female | 1 | 1 |
| 1 | Hill | Spain | Female | 0 | 1 |
| 2 | Onio | France | Female | 1 | 0 |
| 3 | Boni | France | Female | 0 | 0 |
| 4 | Mitchell | Spain | Female | 1 | 1 |
Univariate Analysis¶
Distribution of Target variable Exited
# Find Distribution of Target variable Exited
labeled_barplot(data,'Exited',perc=True)
<ipython-input-19-c4024b896aa3>:22: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. ax = sns.countplot(
Observation
- 0=No ( Customer did not leave the bank )
- 1=Yes ( Customer left the bank )
- Target variable "Exited" has imbalanced distribution of customers who left the bank and not.
- 79.6% of customers are still working with bank and 20.4% of customers left the bank.
Surname
#Observing Surname
labeled_barplot(data,'Surname',n=10)
<ipython-input-19-c4024b896aa3>:22: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. ax = sns.countplot(
Observations
- Top 10 surnames are Brown,Genovese,Maclean,Martin,Scott,Shih,Smith,Walker,Write and Yeh.
- Smith has highest count (32).
CreditScore
# Observing Creditscore
histogram_boxplot(data,'CreditScore')
Observations
- CreditScore data is uniformly distributed but it has outliers on right side with slightly skewness towards left.
- Average value(650) is slightly less than its median(652) which is considerable.
Geography
# Observing Geography
labeled_barplot(data,'Geography',perc=True)
<ipython-input-19-c4024b896aa3>:22: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. ax = sns.countplot(
Observations
- Geography has three unique values,"France","Germany" and "Spain".
- 50% of customers are from France and Customer from Germany and Spain are equally distributed.
Age
#observing Age
histogram_boxplot(data,'Age')
Observations
- Age is right skewed with outliers on left side.
- A most of the customers having age 30 to 50yrs.
- Average age is 39ys and median is 37yrs which is acceptable.
Tenure
#Observing Tenure
histogram_boxplot(data,'Tenure')
labeled_barplot(data,'Tenure',perc=True)
<ipython-input-19-c4024b896aa3>:22: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. ax = sns.countplot(
Observations
- Tenure is uniformly distributed.
- Most of the customers are with bank for 1 yr to 9 yrs.
- Approximately 5% customers are with bank for 10yrs and no more.
Balance
#Observing Balance
histogram_boxplot(data,'Balance')
Observations
- The mean (76485.889288) is less than the median (97198.54), indicating a potential left skew in the distribution of account balances.
- This shows that the minimum account balance in your dataset is 0.
NumOfProducts
# Observing NumOfProducts
labeled_barplot(data,'NumOfProducts',perc=True)
<ipython-input-19-c4024b896aa3>:22: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. ax = sns.countplot(
Observarions
- 50.8% of the customers have only one product.
- 45.9% of the customers have two products.
- Few customers have 3(2.7%) and 4 products(0.6%)
HasCrCard
#Observing HasCrCard
labeled_barplot(data,'HasCrCard',perc=True)
<ipython-input-19-c4024b896aa3>:22: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. ax = sns.countplot(
Observarions
- 70.5% Customers has credit card.
- 29.4% Customers has no credit card.
IsActiveMember
# Observing IsActiveMember
labeled_barplot(data,'IsActiveMember',perc=True)
<ipython-input-19-c4024b896aa3>:22: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. ax = sns.countplot(
Observarions
- 51.5% customers are active members where as 48.5% are non active members which are nearly equally distributed.
EstimatedSalary
# Observing Estimated Salary
histogram_boxplot(data,'EstimatedSalary')
Observarions
- EstimatedSalary is evenly distributed between 0 to 200K.
Bivariate Analysis¶
Age and Existed
# Observating Age and Existed
distribution_plot_wrt_target(data,'Age','Exited')
<ipython-input-16-3732c5f6b024>:28: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. sns.boxplot(data=data, x=target, y=predictor, ax=axs[1, 0], palette="gist_rainbow") <ipython-input-16-3732c5f6b024>:31: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. sns.boxplot(
Observations -Age is evenly distributed with respect to target varible,but Not Exited has more outliers than Exited.
- Observed the median age of customers who have exited is about 45, whereas the median age of customers who have not exited is about 35.
- Observed the max age of customers who have exited is much greated than the max age of customers who have not exited if outliers are ignored.
CreditScore And Exited
#Observing distribution of Creditscore and Exited
distribution_plot_wrt_target(data,'CreditScore','Exited')
<ipython-input-16-3732c5f6b024>:28: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. sns.boxplot(data=data, x=target, y=predictor, ax=axs[1, 0], palette="gist_rainbow") <ipython-input-16-3732c5f6b024>:31: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. sns.boxplot(
Observations
- CreditScore is evenly distributed with respect to Exited.Exited with few outliers.
- CreditScore is in the range of 400 to 900 for not exited and 200 to 900 for Exited.
- Median is almost same for both exited and not exited customers,it's approximately 650.
- The minimum CreditScore of customers who have exited is lower than the min CreditScore of customers who have not exited.
Geography And Exited
#Distribution of Geography with respect to Exited
stacked_barplot(data,'Geography','Exited')
Exited 0 1 All Geography All 7963 2037 10000 Germany 1695 814 2509 France 4204 810 5014 Spain 2064 413 2477 ------------------------------------------------------------------------------------------------------------------------
Observations
- Germany has more Exited customer than other two.Spain and Franse has evenly distribution for both Exited and not exited customers.
Gender And Exited
#Distribution of Gender and Exited
stacked_barplot(data,'Gender','Exited')
Exited 0 1 All Gender All 7963 2037 10000 Female 3404 1139 4543 Male 4559 898 5457 ------------------------------------------------------------------------------------------------------------------------
Observations
- Observed slightly more females have exited than men.
- Although more females have exited, the number of men who have exited is similar.
Tenure And Exited
#distribution of Exited and Tenure
stacked_barplot(data,'Tenure','Exited')
Exited 0 1 All Tenure All 7963 2037 10000 1 803 232 1035 9 771 213 984 3 796 213 1009 5 803 209 1012 4 786 203 989 2 847 201 1048 8 828 197 1025 6 771 196 967 7 851 177 1028 10 389 101 490 0 318 95 413 ------------------------------------------------------------------------------------------------------------------------
Observations
- Mostly distribution ratio of exited and not exited is similar for tenure categories.
NumOfProducts And Exited
#Distribution of NumProducts and Exited
stacked_barplot(data,'NumOfProducts','Exited')
Exited 0 1 All NumOfProducts All 7963 2037 10000 1 3675 1409 5084 2 4242 348 4590 3 46 220 266 4 0 60 60 ------------------------------------------------------------------------------------------------------------------------
Observations
- Distribution ratio of Exited and Not Exited is different for all types of products.
- All customers with product number 4 are exited.
- Maximum number of customers with product 3 have exited.
- Customer with two products are not exited in large amount as compared to only one product.
Balance And Exited
#distribution of Balance and Exited
distribution_plot_wrt_target(data,'Balance','Exited')
<ipython-input-16-3732c5f6b024>:28: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. sns.boxplot(data=data, x=target, y=predictor, ax=axs[1, 0], palette="gist_rainbow") <ipython-input-16-3732c5f6b024>:31: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. sns.boxplot(
Observations
- Both exited and not exited customers are negatively skewed w.r.t. balance.
- Not exited customer has mimimum balance of 0.
- Median of less than 100k for not Exited and more than 100k for Exited.
- Balance range for exited customer is higher than not exited.
HasCrCard And Exited
# Distribution of HasCrCard and Exited
stacked_barplot(data,'HasCrCard','Exited')
Exited 0 1 All HasCrCard All 7963 2037 10000 1 5631 1424 7055 0 2332 613 2945 ------------------------------------------------------------------------------------------------------------------------
Observations
- Distribution Ratio of Exited and Not Exited Customers with respect to hasCrcard is same.
EstimatedSalary and Exited
#Distribution of EstimatedSalary and Exited
distribution_plot_wrt_target(data,'EstimatedSalary','Exited')
<ipython-input-16-3732c5f6b024>:28: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. sns.boxplot(data=data, x=target, y=predictor, ax=axs[1, 0], palette="gist_rainbow") <ipython-input-16-3732c5f6b024>:31: FutureWarning: Passing `palette` without assigning `hue` is deprecated and will be removed in v0.14.0. Assign the `x` variable to `hue` and set `legend=False` for the same effect. sns.boxplot(
Observations
- Distrubution of exited and not exited customers with respect to EstimatedSalary is same.
- Both have median EstimatedSalary is 100K and in between range is 0 to 200K.
isActiveMember And Exited
#Distribution of isActiveMember and Exited
stacked_barplot(data,'IsActiveMember','Exited')
Exited 0 1 All IsActiveMember All 7963 2037 10000 0 3547 1302 4849 1 4416 735 5151 ------------------------------------------------------------------------------------------------------------------------
OBervations
- Distribution ratio of Exited and Not exited for not active members is more than active members.
Multivariate Analysis
# Created a correlation matrix to show any correlations between non-categorical columns.
# Values of 1 are highly positively correlated, values of -1 are highly negatively correlated.
plt.figure(figsize=(15, 7))
# Selecting only numerical features for correlation calculation
numerical_df = df.select_dtypes(include=np.number)
sns.heatmap(numerical_df.corr(), annot=True, vmin=-1, vmax=1, fmt=".2f", cmap="Spectral")
plt.show()
Observations
- No significant correlation exists between variables.
- There is some slight positive correlation between Age and Exited.
- There is some slight negative correlation between Balance and NumOfProducts.
Data Preprocessing¶
Train-validation-test Split¶
- Split data into training, validation, and test sets.
- Models will be trained on training data, and evaluated on validation data.
- The best models will be tuned and finally evaluated on the test data.
# Creating the independent variable data frame.
X = data.drop(['Exited','Surname','RowNumber','CustomerId'], axis=1)
#Convert Object types to category
X['Geography']=X['Geography'].astype('category')
X['Gender']=X['Gender'].astype('category')
# Creating the dependent variable data frame.
y = data[['Exited']]
X.info()
<class 'pandas.core.frame.DataFrame'> RangeIndex: 10000 entries, 0 to 9999 Data columns (total 10 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 CreditScore 10000 non-null int64 1 Geography 10000 non-null category 2 Gender 10000 non-null category 3 Age 10000 non-null int64 4 Tenure 10000 non-null int64 5 Balance 10000 non-null float64 6 NumOfProducts 10000 non-null int64 7 HasCrCard 10000 non-null int64 8 IsActiveMember 10000 non-null int64 9 EstimatedSalary 10000 non-null float64 dtypes: category(2), float64(2), int64(6) memory usage: 644.9 KB
# Splitting data into training and temp data frames.
X_train, X_temp, y_train, y_temp = train_test_split(X, y, test_size=0.3, random_state=1)
# Splitting temp data frame into validation and test data frames.
X_val, X_test, y_val, y_test = train_test_split(X_temp, y_temp, test_size=0.4, random_state=1)
#Checking the shape of train,test and val data
X_train.shape, X_val.shape, X_test.shape
((7000, 10), (1800, 10), (1200, 10))
X_train.head()
| CreditScore | Geography | Gender | Age | Tenure | Balance | NumOfProducts | HasCrCard | IsActiveMember | EstimatedSalary | |
|---|---|---|---|---|---|---|---|---|---|---|
| 2228 | 644 | France | Female | 37 | 8 | 0.00 | 2 | 1 | 0 | 20968.88 |
| 5910 | 481 | France | Female | 39 | 6 | 0.00 | 1 | 1 | 1 | 24677.54 |
| 1950 | 680 | France | Female | 37 | 10 | 123806.28 | 1 | 1 | 0 | 81776.84 |
| 2119 | 690 | France | Male | 29 | 5 | 0.00 | 2 | 1 | 0 | 108577.97 |
| 5947 | 656 | France | Female | 45 | 7 | 145933.27 | 1 | 1 | 1 | 199392.14 |
# Printing the size of the Training, Validation, and Test data frames.
print("-"*40)
print("Shape of Training Set : ", X_train.shape)
print("Shape of Validation Set", X_val.shape)
print("Shape of Test Set : ", X_test.shape)
print("-"*40)
print("Percentage of classes in training set:")
print(y_train.value_counts(normalize=True))
print("-"*40)
print("Percentage of classes in validation set:")
print(y_val.value_counts(normalize=True))
print("-"*40)
print("Percentage of classes in test set:")
print(y_test.value_counts(normalize=True))
print("-"*40)
---------------------------------------- Shape of Training Set : (7000, 10) Shape of Validation Set (1800, 10) Shape of Test Set : (1200, 10) ---------------------------------------- Percentage of classes in training set: Exited 0 0.798571 1 0.201429 Name: proportion, dtype: float64 ---------------------------------------- Percentage of classes in validation set: Exited 0 0.790556 1 0.209444 Name: proportion, dtype: float64 ---------------------------------------- Percentage of classes in test set: Exited 0 0.791667 1 0.208333 Name: proportion, dtype: float64 ----------------------------------------
- There is imbalance in Target variable distribution.
Dummy Variable Creation¶
# Encoding categorical variables for use in models.
# Dropping first of each encoded column to reduce data frame size.
#categorical columns
cat_cols = ['Geography','Gender']
# Encoding X_train data frame categorical columns.
X_train = pd.get_dummies(X_train,columns=cat_cols,drop_first=True,dtype='uint8')
# Encoding X_val data frame categorical columns.
X_val = pd.get_dummies(X_val,columns=cat_cols,drop_first=True,dtype='uint8')
# Encoding X_test data frame categorical columns.
X_test = pd.get_dummies(X_test,columns=cat_cols,drop_first=True,dtype='uint8')
# Printing shape of new data frames.
print("Shape of X_train:", X_train.shape)
print("Shape of X_val:", X_val.shape)
print("Shape of X_test:", X_test.shape)
Shape of X_train: (7000, 11) Shape of X_val: (1800, 11) Shape of X_test: (1200, 11)
# Printing shape of new data frames.
print("Shape of y_train:", y_train.shape)
print("Shape of y_val:", y_val.shape)
print("Shape of y_test:", y_test.shape)
Shape of y_train: (7000, 1) Shape of y_val: (1800, 1) Shape of y_test: (1200, 1)
# Checking information of new data frame's columns.
X_train.info()
<class 'pandas.core.frame.DataFrame'> Index: 7000 entries, 2228 to 235 Data columns (total 11 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 CreditScore 7000 non-null int64 1 Age 7000 non-null int64 2 Tenure 7000 non-null int64 3 Balance 7000 non-null float64 4 NumOfProducts 7000 non-null int64 5 HasCrCard 7000 non-null int64 6 IsActiveMember 7000 non-null int64 7 EstimatedSalary 7000 non-null float64 8 Geography_Germany 7000 non-null uint8 9 Geography_Spain 7000 non-null uint8 10 Gender_Male 7000 non-null uint8 dtypes: float64(2), int64(6), uint8(3) memory usage: 512.7 KB
# Checking information of new data frame's columns.
X_val.info()
<class 'pandas.core.frame.DataFrame'> Index: 1800 entries, 8043 to 2247 Data columns (total 11 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 CreditScore 1800 non-null int64 1 Age 1800 non-null int64 2 Tenure 1800 non-null int64 3 Balance 1800 non-null float64 4 NumOfProducts 1800 non-null int64 5 HasCrCard 1800 non-null int64 6 IsActiveMember 1800 non-null int64 7 EstimatedSalary 1800 non-null float64 8 Geography_Germany 1800 non-null uint8 9 Geography_Spain 1800 non-null uint8 10 Gender_Male 1800 non-null uint8 dtypes: float64(2), int64(6), uint8(3) memory usage: 131.8 KB
# Checking information of new data frame's columns.
X_test.info()
<class 'pandas.core.frame.DataFrame'> Index: 1200 entries, 5309 to 9000 Data columns (total 11 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 CreditScore 1200 non-null int64 1 Age 1200 non-null int64 2 Tenure 1200 non-null int64 3 Balance 1200 non-null float64 4 NumOfProducts 1200 non-null int64 5 HasCrCard 1200 non-null int64 6 IsActiveMember 1200 non-null int64 7 EstimatedSalary 1200 non-null float64 8 Geography_Germany 1200 non-null uint8 9 Geography_Spain 1200 non-null uint8 10 Gender_Male 1200 non-null uint8 dtypes: float64(2), int64(6), uint8(3) memory usage: 87.9 KB
Data Normalization¶
# Scaling numerical data of independent variables using StandardScaler()
sc=StandardScaler()
temp = sc.fit(X_train[["CreditScore","Age","Tenure","Balance","EstimatedSalary"]])
X_train[["CreditScore","Age","Tenure","Balance","EstimatedSalary"]] = temp.transform(X_train[["CreditScore","Age","Tenure","Balance","EstimatedSalary"]])
X_test[["CreditScore","Age","Tenure","Balance","EstimatedSalary"]] = temp.transform(X_test[["CreditScore","Age","Tenure","Balance","EstimatedSalary"]])
X_val[["CreditScore","Age","Tenure","Balance","EstimatedSalary"]] = temp.transform(X_val[["CreditScore","Age","Tenure","Balance","EstimatedSalary"]])
Model Building¶
Model Evaluation Criterion¶
Write down the logic for choosing the metric that would be the best metric for this business scenario.
Model can make wrong predictions as:
Predicting a customer is exiting and the customer is not exiting (False Positive - FP)
Predicting a customer is not exiting and customer is exiting (False Negative - FN)
Which case is more important?
- False Negatives (FN) are worse - predicting that customer is not exiting but he/she is exiting.
It might cause loss to the bank because the bank will not offer these misidentified customers any initiative to retain their business.
How to reduce this loss i.e need to reduce False Negative?
- Bank would want Recall to be maximized, greater the Recall higher the chances of minimizing FN.
- Hence, the focus should be on increasing Recall or minimizing the FN or in other words identifying the True Positive(i.e. Class 1) so that the bank can retain their customers.
Utility Functions
def plot(history, name):
"""
Function to plot loss/accuracy
history: an object which stores the metrics and losses.
name: can be one of Loss or Accuracy
"""
fig, ax = plt.subplots() #Creating a subplot with figure and axes.
plt.plot(history.history[name]) #Plotting the train accuracy or train loss
plt.plot(history.history['val_'+name]) #Plotting the validation accuracy or validation loss
plt.title('Model ' + name.capitalize()) #Defining the title of the plot.
plt.ylabel(name.capitalize()) #Capitalizing the first letter.
plt.xlabel('Epoch') #Defining the label for the x-axis.
fig.legend(['Train', 'Validation'], loc="outside right upper") #Defining the legend, loc controls the position of the legend.
# defining a function to compute different metrics to check performance of a classification model built using statsmodels
def model_performance_classification(
model, predictors, target, threshold=0.5
):
"""
Function to compute different metrics to check classification model performance
model: classifier
predictors: independent variables
target: dependent variable
threshold: threshold for classifying the observation as class 1
"""
# checking which probabilities are greater than threshold
pred = model.predict(predictors) > threshold
# pred_temp = model.predict(predictors) > threshold
# # rounding off the above values to get classes
# pred = np.round(pred_temp)
acc = accuracy_score(target, pred) # to compute Accuracy
recall = recall_score(target, pred, average='weighted') # to compute Recall
precision = precision_score(target, pred, average='weighted') # to compute Precision
f1 = f1_score(target, pred, average='weighted') # to compute F1-score
# creating a dataframe of metrics
df_perf = pd.DataFrame(
{"Accuracy": acc, "Recall": recall, "Precision": precision, "F1 Score": f1,},
index=[0],
)
return df_perf
As we have are dealing with an imbalance in class distribution, we will be using class weights to allow the model to give proportionally more importance to the minority class.
# Calculate class weights for imbalanced dataset
cw = (y_train.shape[0]) / np.bincount(y_train['Exited'].astype(int)) # Extract values from the 'Exited' column and convert them to integers
# Create a dictionary mapping class indices to their respective class weights
cw_dict = {}
for i in range(cw.shape[0]):
cw_dict[i] = cw[i]
cw_dict
{0: 1.2522361359570662, 1: 4.964539007092198}
Neural Network with SGD Optimizer¶
Model 0
Let's start with a neural network consisting of
- two hidden layers with 32 and 64 neurons respectively
- activation function of ReLU.
- SGD as the optimizer
# clears the current Keras session, resetting all layers and models previously created, freeing up memory and resources.
tf.keras.backend.clear_session()
#Initializing the neural network
model = Sequential()
model.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model.add(Dense(32,activation="relu"))
model.add(Dense(1,activation="sigmoid"))
model.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
X_train.shape[0]
7000
optimizer = tf.keras.optimizers.SGD() # defining SGD as the optimizer to be used
model.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
start = time.time()
history = model.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=100,class_weight=cw_dict, verbose=2,batch_size = X_train.shape[0])
end=time.time()
Epoch 1/100 1/1 - 1s - 1s/step - accuracy: 0.3911 - loss: 1.3627 - val_accuracy: 0.4139 - val_loss: 0.7444 Epoch 2/100 1/1 - 0s - 266ms/step - accuracy: 0.4011 - loss: 1.3595 - val_accuracy: 0.4283 - val_loss: 0.7394 Epoch 3/100 1/1 - 0s - 167ms/step - accuracy: 0.4117 - loss: 1.3565 - val_accuracy: 0.4383 - val_loss: 0.7345 Epoch 4/100 1/1 - 0s - 150ms/step - accuracy: 0.4243 - loss: 1.3536 - val_accuracy: 0.4517 - val_loss: 0.7299 Epoch 5/100 1/1 - 0s - 293ms/step - accuracy: 0.4371 - loss: 1.3508 - val_accuracy: 0.4583 - val_loss: 0.7255 Epoch 6/100 1/1 - 0s - 129ms/step - accuracy: 0.4477 - loss: 1.3483 - val_accuracy: 0.4650 - val_loss: 0.7213 Epoch 7/100 1/1 - 0s - 92ms/step - accuracy: 0.4570 - loss: 1.3458 - val_accuracy: 0.4789 - val_loss: 0.7173 Epoch 8/100 1/1 - 0s - 91ms/step - accuracy: 0.4680 - loss: 1.3434 - val_accuracy: 0.4928 - val_loss: 0.7134 Epoch 9/100 1/1 - 0s - 114ms/step - accuracy: 0.4786 - loss: 1.3412 - val_accuracy: 0.4994 - val_loss: 0.7097 Epoch 10/100 1/1 - 0s - 89ms/step - accuracy: 0.4929 - loss: 1.3390 - val_accuracy: 0.5072 - val_loss: 0.7062 Epoch 11/100 1/1 - 0s - 103ms/step - accuracy: 0.5029 - loss: 1.3370 - val_accuracy: 0.5178 - val_loss: 0.7028 Epoch 12/100 1/1 - 0s - 97ms/step - accuracy: 0.5139 - loss: 1.3350 - val_accuracy: 0.5289 - val_loss: 0.6996 Epoch 13/100 1/1 - 0s - 96ms/step - accuracy: 0.5264 - loss: 1.3331 - val_accuracy: 0.5383 - val_loss: 0.6965 Epoch 14/100 1/1 - 0s - 130ms/step - accuracy: 0.5363 - loss: 1.3313 - val_accuracy: 0.5450 - val_loss: 0.6935 Epoch 15/100 1/1 - 0s - 152ms/step - accuracy: 0.5440 - loss: 1.3296 - val_accuracy: 0.5517 - val_loss: 0.6906 Epoch 16/100 1/1 - 0s - 125ms/step - accuracy: 0.5524 - loss: 1.3279 - val_accuracy: 0.5633 - val_loss: 0.6879 Epoch 17/100 1/1 - 0s - 138ms/step - accuracy: 0.5624 - loss: 1.3263 - val_accuracy: 0.5717 - val_loss: 0.6852 Epoch 18/100 1/1 - 0s - 93ms/step - accuracy: 0.5716 - loss: 1.3247 - val_accuracy: 0.5794 - val_loss: 0.6827 Epoch 19/100 1/1 - 0s - 94ms/step - accuracy: 0.5791 - loss: 1.3232 - val_accuracy: 0.5878 - val_loss: 0.6802 Epoch 20/100 1/1 - 0s - 134ms/step - accuracy: 0.5851 - loss: 1.3217 - val_accuracy: 0.5967 - val_loss: 0.6779 Epoch 21/100 1/1 - 0s - 94ms/step - accuracy: 0.5919 - loss: 1.3203 - val_accuracy: 0.6061 - val_loss: 0.6756 Epoch 22/100 1/1 - 0s - 135ms/step - accuracy: 0.6007 - loss: 1.3189 - val_accuracy: 0.6089 - val_loss: 0.6734 Epoch 23/100 1/1 - 0s - 178ms/step - accuracy: 0.6074 - loss: 1.3175 - val_accuracy: 0.6133 - val_loss: 0.6713 Epoch 24/100 1/1 - 0s - 91ms/step - accuracy: 0.6129 - loss: 1.3162 - val_accuracy: 0.6172 - val_loss: 0.6693 Epoch 25/100 1/1 - 0s - 160ms/step - accuracy: 0.6196 - loss: 1.3149 - val_accuracy: 0.6211 - val_loss: 0.6674 Epoch 26/100 1/1 - 0s - 95ms/step - accuracy: 0.6231 - loss: 1.3137 - val_accuracy: 0.6233 - val_loss: 0.6655 Epoch 27/100 1/1 - 0s - 136ms/step - accuracy: 0.6279 - loss: 1.3125 - val_accuracy: 0.6267 - val_loss: 0.6637 Epoch 28/100 1/1 - 0s - 91ms/step - accuracy: 0.6327 - loss: 1.3113 - val_accuracy: 0.6306 - val_loss: 0.6619 Epoch 29/100 1/1 - 0s - 94ms/step - accuracy: 0.6370 - loss: 1.3101 - val_accuracy: 0.6339 - val_loss: 0.6603 Epoch 30/100 1/1 - 0s - 131ms/step - accuracy: 0.6414 - loss: 1.3090 - val_accuracy: 0.6389 - val_loss: 0.6586 Epoch 31/100 1/1 - 0s - 104ms/step - accuracy: 0.6454 - loss: 1.3078 - val_accuracy: 0.6428 - val_loss: 0.6571 Epoch 32/100 1/1 - 0s - 159ms/step - accuracy: 0.6476 - loss: 1.3067 - val_accuracy: 0.6456 - val_loss: 0.6555 Epoch 33/100 1/1 - 0s - 130ms/step - accuracy: 0.6526 - loss: 1.3057 - val_accuracy: 0.6506 - val_loss: 0.6541 Epoch 34/100 1/1 - 0s - 92ms/step - accuracy: 0.6566 - loss: 1.3046 - val_accuracy: 0.6522 - val_loss: 0.6527 Epoch 35/100 1/1 - 0s - 93ms/step - accuracy: 0.6604 - loss: 1.3035 - val_accuracy: 0.6594 - val_loss: 0.6513 Epoch 36/100 1/1 - 0s - 137ms/step - accuracy: 0.6643 - loss: 1.3025 - val_accuracy: 0.6656 - val_loss: 0.6500 Epoch 37/100 1/1 - 0s - 89ms/step - accuracy: 0.6676 - loss: 1.3015 - val_accuracy: 0.6706 - val_loss: 0.6487 Epoch 38/100 1/1 - 0s - 96ms/step - accuracy: 0.6709 - loss: 1.3005 - val_accuracy: 0.6711 - val_loss: 0.6474 Epoch 39/100 1/1 - 0s - 132ms/step - accuracy: 0.6731 - loss: 1.2995 - val_accuracy: 0.6761 - val_loss: 0.6462 Epoch 40/100 1/1 - 0s - 147ms/step - accuracy: 0.6756 - loss: 1.2985 - val_accuracy: 0.6789 - val_loss: 0.6451 Epoch 41/100 1/1 - 0s - 132ms/step - accuracy: 0.6773 - loss: 1.2976 - val_accuracy: 0.6811 - val_loss: 0.6439 Epoch 42/100 1/1 - 0s - 268ms/step - accuracy: 0.6809 - loss: 1.2966 - val_accuracy: 0.6867 - val_loss: 0.6428 Epoch 43/100 1/1 - 0s - 93ms/step - accuracy: 0.6829 - loss: 1.2957 - val_accuracy: 0.6917 - val_loss: 0.6418 Epoch 44/100 1/1 - 0s - 130ms/step - accuracy: 0.6860 - loss: 1.2947 - val_accuracy: 0.6939 - val_loss: 0.6408 Epoch 45/100 1/1 - 0s - 92ms/step - accuracy: 0.6870 - loss: 1.2938 - val_accuracy: 0.6978 - val_loss: 0.6398 Epoch 46/100 1/1 - 0s - 140ms/step - accuracy: 0.6894 - loss: 1.2929 - val_accuracy: 0.6994 - val_loss: 0.6388 Epoch 47/100 1/1 - 0s - 111ms/step - accuracy: 0.6907 - loss: 1.2920 - val_accuracy: 0.6994 - val_loss: 0.6378 Epoch 48/100 1/1 - 0s - 98ms/step - accuracy: 0.6914 - loss: 1.2911 - val_accuracy: 0.7017 - val_loss: 0.6369 Epoch 49/100 1/1 - 0s - 98ms/step - accuracy: 0.6930 - loss: 1.2902 - val_accuracy: 0.7044 - val_loss: 0.6360 Epoch 50/100 1/1 - 0s - 181ms/step - accuracy: 0.6953 - loss: 1.2893 - val_accuracy: 0.7044 - val_loss: 0.6352 Epoch 51/100 1/1 - 0s - 96ms/step - accuracy: 0.6971 - loss: 1.2884 - val_accuracy: 0.7050 - val_loss: 0.6343 Epoch 52/100 1/1 - 0s - 174ms/step - accuracy: 0.6994 - loss: 1.2875 - val_accuracy: 0.7067 - val_loss: 0.6335 Epoch 53/100 1/1 - 0s - 98ms/step - accuracy: 0.7016 - loss: 1.2867 - val_accuracy: 0.7078 - val_loss: 0.6327 Epoch 54/100 1/1 - 0s - 136ms/step - accuracy: 0.7027 - loss: 1.2858 - val_accuracy: 0.7083 - val_loss: 0.6319 Epoch 55/100 1/1 - 0s - 107ms/step - accuracy: 0.7044 - loss: 1.2850 - val_accuracy: 0.7094 - val_loss: 0.6311 Epoch 56/100 1/1 - 0s - 145ms/step - accuracy: 0.7039 - loss: 1.2841 - val_accuracy: 0.7128 - val_loss: 0.6304 Epoch 57/100 1/1 - 0s - 249ms/step - accuracy: 0.7059 - loss: 1.2833 - val_accuracy: 0.7133 - val_loss: 0.6297 Epoch 58/100 1/1 - 0s - 91ms/step - accuracy: 0.7079 - loss: 1.2825 - val_accuracy: 0.7133 - val_loss: 0.6290 Epoch 59/100 1/1 - 0s - 141ms/step - accuracy: 0.7081 - loss: 1.2816 - val_accuracy: 0.7133 - val_loss: 0.6283 Epoch 60/100 1/1 - 0s - 94ms/step - accuracy: 0.7097 - loss: 1.2808 - val_accuracy: 0.7150 - val_loss: 0.6276 Epoch 61/100 1/1 - 0s - 92ms/step - accuracy: 0.7110 - loss: 1.2800 - val_accuracy: 0.7156 - val_loss: 0.6269 Epoch 62/100 1/1 - 0s - 174ms/step - accuracy: 0.7114 - loss: 1.2792 - val_accuracy: 0.7150 - val_loss: 0.6263 Epoch 63/100 1/1 - 0s - 118ms/step - accuracy: 0.7116 - loss: 1.2784 - val_accuracy: 0.7150 - val_loss: 0.6256 Epoch 64/100 1/1 - 0s - 95ms/step - accuracy: 0.7126 - loss: 1.2776 - val_accuracy: 0.7144 - val_loss: 0.6250 Epoch 65/100 1/1 - 0s - 97ms/step - accuracy: 0.7137 - loss: 1.2768 - val_accuracy: 0.7167 - val_loss: 0.6244 Epoch 66/100 1/1 - 0s - 93ms/step - accuracy: 0.7143 - loss: 1.2760 - val_accuracy: 0.7172 - val_loss: 0.6239 Epoch 67/100 1/1 - 0s - 143ms/step - accuracy: 0.7151 - loss: 1.2752 - val_accuracy: 0.7172 - val_loss: 0.6233 Epoch 68/100 1/1 - 0s - 98ms/step - accuracy: 0.7157 - loss: 1.2744 - val_accuracy: 0.7178 - val_loss: 0.6227 Epoch 69/100 1/1 - 0s - 98ms/step - accuracy: 0.7157 - loss: 1.2737 - val_accuracy: 0.7167 - val_loss: 0.6222 Epoch 70/100 1/1 - 0s - 136ms/step - accuracy: 0.7160 - loss: 1.2729 - val_accuracy: 0.7167 - val_loss: 0.6216 Epoch 71/100 1/1 - 0s - 111ms/step - accuracy: 0.7161 - loss: 1.2721 - val_accuracy: 0.7178 - val_loss: 0.6211 Epoch 72/100 1/1 - 0s - 112ms/step - accuracy: 0.7170 - loss: 1.2714 - val_accuracy: 0.7189 - val_loss: 0.6206 Epoch 73/100 1/1 - 0s - 138ms/step - accuracy: 0.7179 - loss: 1.2706 - val_accuracy: 0.7194 - val_loss: 0.6201 Epoch 74/100 1/1 - 0s - 99ms/step - accuracy: 0.7181 - loss: 1.2699 - val_accuracy: 0.7189 - val_loss: 0.6196 Epoch 75/100 1/1 - 0s - 134ms/step - accuracy: 0.7181 - loss: 1.2691 - val_accuracy: 0.7194 - val_loss: 0.6191 Epoch 76/100 1/1 - 0s - 97ms/step - accuracy: 0.7183 - loss: 1.2684 - val_accuracy: 0.7206 - val_loss: 0.6186 Epoch 77/100 1/1 - 0s - 97ms/step - accuracy: 0.7194 - loss: 1.2677 - val_accuracy: 0.7211 - val_loss: 0.6181 Epoch 78/100 1/1 - 0s - 97ms/step - accuracy: 0.7199 - loss: 1.2669 - val_accuracy: 0.7217 - val_loss: 0.6177 Epoch 79/100 1/1 - 0s - 95ms/step - accuracy: 0.7210 - loss: 1.2662 - val_accuracy: 0.7222 - val_loss: 0.6172 Epoch 80/100 1/1 - 0s - 119ms/step - accuracy: 0.7214 - loss: 1.2655 - val_accuracy: 0.7239 - val_loss: 0.6168 Epoch 81/100 1/1 - 0s - 110ms/step - accuracy: 0.7216 - loss: 1.2648 - val_accuracy: 0.7244 - val_loss: 0.6163 Epoch 82/100 1/1 - 0s - 140ms/step - accuracy: 0.7223 - loss: 1.2640 - val_accuracy: 0.7256 - val_loss: 0.6159 Epoch 83/100 1/1 - 0s - 90ms/step - accuracy: 0.7226 - loss: 1.2633 - val_accuracy: 0.7256 - val_loss: 0.6155 Epoch 84/100 1/1 - 0s - 141ms/step - accuracy: 0.7229 - loss: 1.2626 - val_accuracy: 0.7256 - val_loss: 0.6150 Epoch 85/100 1/1 - 0s - 177ms/step - accuracy: 0.7234 - loss: 1.2619 - val_accuracy: 0.7261 - val_loss: 0.6146 Epoch 86/100 1/1 - 0s - 336ms/step - accuracy: 0.7243 - loss: 1.2612 - val_accuracy: 0.7261 - val_loss: 0.6142 Epoch 87/100 1/1 - 0s - 161ms/step - accuracy: 0.7246 - loss: 1.2605 - val_accuracy: 0.7256 - val_loss: 0.6138 Epoch 88/100 1/1 - 0s - 176ms/step - accuracy: 0.7246 - loss: 1.2598 - val_accuracy: 0.7261 - val_loss: 0.6134 Epoch 89/100 1/1 - 0s - 140ms/step - accuracy: 0.7236 - loss: 1.2591 - val_accuracy: 0.7278 - val_loss: 0.6130 Epoch 90/100 1/1 - 0s - 167ms/step - accuracy: 0.7240 - loss: 1.2585 - val_accuracy: 0.7283 - val_loss: 0.6126 Epoch 91/100 1/1 - 0s - 293ms/step - accuracy: 0.7239 - loss: 1.2578 - val_accuracy: 0.7272 - val_loss: 0.6122 Epoch 92/100 1/1 - 1s - 855ms/step - accuracy: 0.7246 - loss: 1.2571 - val_accuracy: 0.7278 - val_loss: 0.6118 Epoch 93/100 1/1 - 1s - 694ms/step - accuracy: 0.7243 - loss: 1.2564 - val_accuracy: 0.7272 - val_loss: 0.6115 Epoch 94/100 1/1 - 0s - 286ms/step - accuracy: 0.7244 - loss: 1.2558 - val_accuracy: 0.7267 - val_loss: 0.6111 Epoch 95/100 1/1 - 0s - 119ms/step - accuracy: 0.7243 - loss: 1.2551 - val_accuracy: 0.7261 - val_loss: 0.6107 Epoch 96/100 1/1 - 0s - 136ms/step - accuracy: 0.7241 - loss: 1.2544 - val_accuracy: 0.7267 - val_loss: 0.6103 Epoch 97/100 1/1 - 0s - 98ms/step - accuracy: 0.7247 - loss: 1.2538 - val_accuracy: 0.7278 - val_loss: 0.6100 Epoch 98/100 1/1 - 0s - 130ms/step - accuracy: 0.7249 - loss: 1.2531 - val_accuracy: 0.7272 - val_loss: 0.6096 Epoch 99/100 1/1 - 0s - 91ms/step - accuracy: 0.7246 - loss: 1.2525 - val_accuracy: 0.7267 - val_loss: 0.6093 Epoch 100/100 1/1 - 0s - 92ms/step - accuracy: 0.7247 - loss: 1.2518 - val_accuracy: 0.7267 - val_loss: 0.6089
#Defining the columns of the dataframe which are nothing but the hyper parameters and the metrics.
columns = ["# hidden layers","# neurons - hidden layer","activation function - hidden layer ","# epochs","batch size","optimizer","learning rate, momentum","weight initializer","regularization","train loss","validation loss","train accuracy","validation accuracy","time (secs)"]
#Creating a pandas dataframe.
results = pd.DataFrame(columns=columns)
results.loc[0] = [2,[64,32],['relu','relu'],100,7000,"sgd",['-', "-"],"xavier","-",history.history["loss"][-1],history.history["val_loss"][-1],history.history["accuracy"][-1],history.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
Plotting Accuracy vs Epoch Curve
plot(history,'loss')
plot(history,'accuracy')
#observing model performance training data
model_performance_classification(model, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.725429 | 0.725429 | 0.789954 | 0.746624 |
#Model performance for validation data
model_performance_classification(model, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.726667 | 0.726667 | 0.792148 | 0.747172 |
Observations
- The training and validation accuracy increases as epoch increases.
- The validation accuracy is constant between ~80 to 100 epochs.
- The model is not giving good accuracy. At 100 epochs, the accuracy of the model on the training data is about 72.5% and the validation accuracy is 72.6%.
- The validation accuracy is closer to the training accuracy. This indicates that the model is giving a generalized performance.
- Training loss is more than validation loss that mean the model is making more mistakes during training.
- Generally, having a lower validation loss than training loss is a good sign. It suggests your model isn't overfitting and generalizes well.
- Recall for both training and validation is almost same.But we need to improve it.
- In General model is not performing well.
Model 1
- 2. Checking with Reducing batch size to 64
# clears the current Keras session, resetting all layers and models previously created, freeing up memory and resources.
tf.keras.backend.clear_session()
#Initializing the neural network
model_1 = Sequential()
model_1.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model_1.add(Dense(32,activation="relu"))
model_1.add(Dense(1,activation="sigmoid"))
model_1.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
optimizer = tf.keras.optimizers.SGD() # defining SGD as the optimizer to be used
model_1.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
start = time.time()
history_1 = model_1.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=100,class_weight=cw_dict, verbose=2,batch_size =64)
end=time.time()
Epoch 1/100 110/110 - 2s - 14ms/step - accuracy: 0.5029 - loss: 1.3373 - val_accuracy: 0.6639 - val_loss: 0.6474 Epoch 2/100 110/110 - 1s - 5ms/step - accuracy: 0.6991 - loss: 1.2368 - val_accuracy: 0.7189 - val_loss: 0.6076 Epoch 3/100 110/110 - 1s - 5ms/step - accuracy: 0.7219 - loss: 1.1885 - val_accuracy: 0.7322 - val_loss: 0.5805 Epoch 4/100 110/110 - 0s - 4ms/step - accuracy: 0.7284 - loss: 1.1635 - val_accuracy: 0.7089 - val_loss: 0.5910 Epoch 5/100 110/110 - 1s - 5ms/step - accuracy: 0.7174 - loss: 1.1513 - val_accuracy: 0.7444 - val_loss: 0.5426 Epoch 6/100 110/110 - 1s - 5ms/step - accuracy: 0.7233 - loss: 1.1413 - val_accuracy: 0.7306 - val_loss: 0.5561 Epoch 7/100 110/110 - 0s - 3ms/step - accuracy: 0.7184 - loss: 1.1340 - val_accuracy: 0.7467 - val_loss: 0.5316 Epoch 8/100 110/110 - 1s - 5ms/step - accuracy: 0.7193 - loss: 1.1271 - val_accuracy: 0.7344 - val_loss: 0.5400 Epoch 9/100 110/110 - 0s - 3ms/step - accuracy: 0.7187 - loss: 1.1208 - val_accuracy: 0.7300 - val_loss: 0.5384 Epoch 10/100 110/110 - 1s - 5ms/step - accuracy: 0.7177 - loss: 1.1142 - val_accuracy: 0.7144 - val_loss: 0.5522 Epoch 11/100 110/110 - 0s - 4ms/step - accuracy: 0.7189 - loss: 1.1079 - val_accuracy: 0.7144 - val_loss: 0.5487 Epoch 12/100 110/110 - 0s - 4ms/step - accuracy: 0.7181 - loss: 1.1030 - val_accuracy: 0.7222 - val_loss: 0.5401 Epoch 13/100 110/110 - 0s - 3ms/step - accuracy: 0.7199 - loss: 1.0965 - val_accuracy: 0.7356 - val_loss: 0.5161 Epoch 14/100 110/110 - 0s - 3ms/step - accuracy: 0.7223 - loss: 1.0908 - val_accuracy: 0.7111 - val_loss: 0.5510 Epoch 15/100 110/110 - 1s - 5ms/step - accuracy: 0.7229 - loss: 1.0837 - val_accuracy: 0.7144 - val_loss: 0.5460 Epoch 16/100 110/110 - 0s - 3ms/step - accuracy: 0.7266 - loss: 1.0765 - val_accuracy: 0.7106 - val_loss: 0.5507 Epoch 17/100 110/110 - 1s - 6ms/step - accuracy: 0.7244 - loss: 1.0711 - val_accuracy: 0.7250 - val_loss: 0.5387 Epoch 18/100 110/110 - 0s - 3ms/step - accuracy: 0.7313 - loss: 1.0636 - val_accuracy: 0.7372 - val_loss: 0.5139 Epoch 19/100 110/110 - 1s - 6ms/step - accuracy: 0.7344 - loss: 1.0576 - val_accuracy: 0.7322 - val_loss: 0.5301 Epoch 20/100 110/110 - 1s - 6ms/step - accuracy: 0.7399 - loss: 1.0494 - val_accuracy: 0.7272 - val_loss: 0.5348 Epoch 21/100 110/110 - 0s - 3ms/step - accuracy: 0.7414 - loss: 1.0438 - val_accuracy: 0.7517 - val_loss: 0.5017 Epoch 22/100 110/110 - 0s - 4ms/step - accuracy: 0.7473 - loss: 1.0374 - val_accuracy: 0.7389 - val_loss: 0.5228 Epoch 23/100 110/110 - 1s - 5ms/step - accuracy: 0.7516 - loss: 1.0293 - val_accuracy: 0.7517 - val_loss: 0.5017 Epoch 24/100 110/110 - 1s - 6ms/step - accuracy: 0.7551 - loss: 1.0235 - val_accuracy: 0.7539 - val_loss: 0.4952 Epoch 25/100 110/110 - 1s - 8ms/step - accuracy: 0.7566 - loss: 1.0160 - val_accuracy: 0.7500 - val_loss: 0.5004 Epoch 26/100 110/110 - 1s - 5ms/step - accuracy: 0.7591 - loss: 1.0098 - val_accuracy: 0.7417 - val_loss: 0.5103 Epoch 27/100 110/110 - 1s - 6ms/step - accuracy: 0.7623 - loss: 1.0035 - val_accuracy: 0.7411 - val_loss: 0.5102 Epoch 28/100 110/110 - 1s - 5ms/step - accuracy: 0.7629 - loss: 0.9960 - val_accuracy: 0.7589 - val_loss: 0.4872 Epoch 29/100 110/110 - 1s - 6ms/step - accuracy: 0.7694 - loss: 0.9907 - val_accuracy: 0.7267 - val_loss: 0.5296 Epoch 30/100 110/110 - 1s - 6ms/step - accuracy: 0.7689 - loss: 0.9841 - val_accuracy: 0.7433 - val_loss: 0.5024 Epoch 31/100 110/110 - 0s - 3ms/step - accuracy: 0.7696 - loss: 0.9795 - val_accuracy: 0.7661 - val_loss: 0.4790 Epoch 32/100 110/110 - 0s - 4ms/step - accuracy: 0.7719 - loss: 0.9737 - val_accuracy: 0.7372 - val_loss: 0.5073 Epoch 33/100 110/110 - 0s - 3ms/step - accuracy: 0.7749 - loss: 0.9697 - val_accuracy: 0.7428 - val_loss: 0.5021 Epoch 34/100 110/110 - 1s - 5ms/step - accuracy: 0.7776 - loss: 0.9623 - val_accuracy: 0.7200 - val_loss: 0.5328 Epoch 35/100 110/110 - 1s - 6ms/step - accuracy: 0.7710 - loss: 0.9599 - val_accuracy: 0.7728 - val_loss: 0.4726 Epoch 36/100 110/110 - 0s - 3ms/step - accuracy: 0.7774 - loss: 0.9554 - val_accuracy: 0.7589 - val_loss: 0.4821 Epoch 37/100 110/110 - 1s - 5ms/step - accuracy: 0.7774 - loss: 0.9525 - val_accuracy: 0.7750 - val_loss: 0.4656 Epoch 38/100 110/110 - 0s - 4ms/step - accuracy: 0.7781 - loss: 0.9461 - val_accuracy: 0.7694 - val_loss: 0.4698 Epoch 39/100 110/110 - 1s - 5ms/step - accuracy: 0.7810 - loss: 0.9434 - val_accuracy: 0.7650 - val_loss: 0.4779 Epoch 40/100 110/110 - 1s - 6ms/step - accuracy: 0.7807 - loss: 0.9403 - val_accuracy: 0.7561 - val_loss: 0.4893 Epoch 41/100 110/110 - 1s - 5ms/step - accuracy: 0.7814 - loss: 0.9366 - val_accuracy: 0.7683 - val_loss: 0.4718 Epoch 42/100 110/110 - 1s - 6ms/step - accuracy: 0.7821 - loss: 0.9335 - val_accuracy: 0.7672 - val_loss: 0.4691 Epoch 43/100 110/110 - 1s - 6ms/step - accuracy: 0.7869 - loss: 0.9311 - val_accuracy: 0.7633 - val_loss: 0.4801 Epoch 44/100 110/110 - 1s - 5ms/step - accuracy: 0.7860 - loss: 0.9271 - val_accuracy: 0.7622 - val_loss: 0.4758 Epoch 45/100 110/110 - 0s - 3ms/step - accuracy: 0.7867 - loss: 0.9227 - val_accuracy: 0.7739 - val_loss: 0.4611 Epoch 46/100 110/110 - 1s - 6ms/step - accuracy: 0.7874 - loss: 0.9185 - val_accuracy: 0.7678 - val_loss: 0.4725 Epoch 47/100 110/110 - 0s - 4ms/step - accuracy: 0.7854 - loss: 0.9184 - val_accuracy: 0.8039 - val_loss: 0.4226 Epoch 48/100 110/110 - 0s - 3ms/step - accuracy: 0.7919 - loss: 0.9152 - val_accuracy: 0.7667 - val_loss: 0.4690 Epoch 49/100 110/110 - 1s - 6ms/step - accuracy: 0.7866 - loss: 0.9132 - val_accuracy: 0.7939 - val_loss: 0.4347 Epoch 50/100 110/110 - 1s - 6ms/step - accuracy: 0.7943 - loss: 0.9105 - val_accuracy: 0.8006 - val_loss: 0.4259 Epoch 51/100 110/110 - 1s - 6ms/step - accuracy: 0.7941 - loss: 0.9084 - val_accuracy: 0.7839 - val_loss: 0.4504 Epoch 52/100 110/110 - 1s - 6ms/step - accuracy: 0.7950 - loss: 0.9054 - val_accuracy: 0.7544 - val_loss: 0.4961 Epoch 53/100 110/110 - 0s - 5ms/step - accuracy: 0.7927 - loss: 0.9040 - val_accuracy: 0.8039 - val_loss: 0.4259 Epoch 54/100 110/110 - 1s - 6ms/step - accuracy: 0.7973 - loss: 0.9001 - val_accuracy: 0.7667 - val_loss: 0.4670 Epoch 55/100 110/110 - 1s - 5ms/step - accuracy: 0.7949 - loss: 0.8982 - val_accuracy: 0.7928 - val_loss: 0.4369 Epoch 56/100 110/110 - 0s - 4ms/step - accuracy: 0.7991 - loss: 0.8952 - val_accuracy: 0.7894 - val_loss: 0.4412 Epoch 57/100 110/110 - 1s - 6ms/step - accuracy: 0.7974 - loss: 0.8929 - val_accuracy: 0.7989 - val_loss: 0.4289 Epoch 58/100 110/110 - 1s - 5ms/step - accuracy: 0.7987 - loss: 0.8914 - val_accuracy: 0.7306 - val_loss: 0.5268 Epoch 59/100 110/110 - 0s - 3ms/step - accuracy: 0.7959 - loss: 0.8894 - val_accuracy: 0.7767 - val_loss: 0.4598 Epoch 60/100 110/110 - 1s - 6ms/step - accuracy: 0.7980 - loss: 0.8874 - val_accuracy: 0.8061 - val_loss: 0.4219 Epoch 61/100 110/110 - 1s - 5ms/step - accuracy: 0.8020 - loss: 0.8847 - val_accuracy: 0.7756 - val_loss: 0.4608 Epoch 62/100 110/110 - 0s - 3ms/step - accuracy: 0.7993 - loss: 0.8832 - val_accuracy: 0.7556 - val_loss: 0.4839 Epoch 63/100 110/110 - 1s - 6ms/step - accuracy: 0.7987 - loss: 0.8835 - val_accuracy: 0.7833 - val_loss: 0.4463 Epoch 64/100 110/110 - 0s - 3ms/step - accuracy: 0.8024 - loss: 0.8797 - val_accuracy: 0.7456 - val_loss: 0.5023 Epoch 65/100 110/110 - 1s - 6ms/step - accuracy: 0.8026 - loss: 0.8800 - val_accuracy: 0.7772 - val_loss: 0.4568 Epoch 66/100 110/110 - 0s - 3ms/step - accuracy: 0.8017 - loss: 0.8757 - val_accuracy: 0.7917 - val_loss: 0.4383 Epoch 67/100 110/110 - 1s - 6ms/step - accuracy: 0.8037 - loss: 0.8764 - val_accuracy: 0.7933 - val_loss: 0.4354 Epoch 68/100 110/110 - 1s - 5ms/step - accuracy: 0.8039 - loss: 0.8744 - val_accuracy: 0.7961 - val_loss: 0.4387 Epoch 69/100 110/110 - 1s - 5ms/step - accuracy: 0.8021 - loss: 0.8735 - val_accuracy: 0.7922 - val_loss: 0.4352 Epoch 70/100 110/110 - 1s - 6ms/step - accuracy: 0.8023 - loss: 0.8710 - val_accuracy: 0.8100 - val_loss: 0.4165 Epoch 71/100 110/110 - 1s - 6ms/step - accuracy: 0.8056 - loss: 0.8715 - val_accuracy: 0.7972 - val_loss: 0.4331 Epoch 72/100 110/110 - 1s - 6ms/step - accuracy: 0.8030 - loss: 0.8697 - val_accuracy: 0.8183 - val_loss: 0.4009 Epoch 73/100 110/110 - 0s - 4ms/step - accuracy: 0.8094 - loss: 0.8673 - val_accuracy: 0.7556 - val_loss: 0.4831 Epoch 74/100 110/110 - 1s - 7ms/step - accuracy: 0.8069 - loss: 0.8669 - val_accuracy: 0.7889 - val_loss: 0.4480 Epoch 75/100 110/110 - 1s - 6ms/step - accuracy: 0.8067 - loss: 0.8641 - val_accuracy: 0.7817 - val_loss: 0.4564 Epoch 76/100 110/110 - 1s - 6ms/step - accuracy: 0.8044 - loss: 0.8624 - val_accuracy: 0.8017 - val_loss: 0.4241 Epoch 77/100 110/110 - 1s - 5ms/step - accuracy: 0.8067 - loss: 0.8622 - val_accuracy: 0.8078 - val_loss: 0.4186 Epoch 78/100 110/110 - 1s - 6ms/step - accuracy: 0.8070 - loss: 0.8617 - val_accuracy: 0.7883 - val_loss: 0.4463 Epoch 79/100 110/110 - 1s - 5ms/step - accuracy: 0.8100 - loss: 0.8624 - val_accuracy: 0.7917 - val_loss: 0.4418 Epoch 80/100 110/110 - 0s - 3ms/step - accuracy: 0.8041 - loss: 0.8597 - val_accuracy: 0.7289 - val_loss: 0.5292 Epoch 81/100 110/110 - 1s - 6ms/step - accuracy: 0.8064 - loss: 0.8586 - val_accuracy: 0.7856 - val_loss: 0.4446 Epoch 82/100 110/110 - 1s - 6ms/step - accuracy: 0.8040 - loss: 0.8589 - val_accuracy: 0.7956 - val_loss: 0.4370 Epoch 83/100 110/110 - 0s - 3ms/step - accuracy: 0.8110 - loss: 0.8526 - val_accuracy: 0.8222 - val_loss: 0.3958 Epoch 84/100 110/110 - 0s - 3ms/step - accuracy: 0.8101 - loss: 0.8555 - val_accuracy: 0.7950 - val_loss: 0.4402 Epoch 85/100 110/110 - 1s - 6ms/step - accuracy: 0.8123 - loss: 0.8518 - val_accuracy: 0.7717 - val_loss: 0.4630 Epoch 86/100 110/110 - 1s - 6ms/step - accuracy: 0.8069 - loss: 0.8525 - val_accuracy: 0.8133 - val_loss: 0.4102 Epoch 87/100 110/110 - 0s - 3ms/step - accuracy: 0.8110 - loss: 0.8507 - val_accuracy: 0.7839 - val_loss: 0.4471 Epoch 88/100 110/110 - 1s - 6ms/step - accuracy: 0.8099 - loss: 0.8514 - val_accuracy: 0.7850 - val_loss: 0.4472 Epoch 89/100 110/110 - 1s - 6ms/step - accuracy: 0.8089 - loss: 0.8492 - val_accuracy: 0.8233 - val_loss: 0.4039 Epoch 90/100 110/110 - 0s - 3ms/step - accuracy: 0.8109 - loss: 0.8478 - val_accuracy: 0.8100 - val_loss: 0.4219 Epoch 91/100 110/110 - 0s - 4ms/step - accuracy: 0.8146 - loss: 0.8467 - val_accuracy: 0.7572 - val_loss: 0.4828 Epoch 92/100 110/110 - 1s - 5ms/step - accuracy: 0.8113 - loss: 0.8473 - val_accuracy: 0.7989 - val_loss: 0.4326 Epoch 93/100 110/110 - 0s - 3ms/step - accuracy: 0.8100 - loss: 0.8439 - val_accuracy: 0.8017 - val_loss: 0.4235 Epoch 94/100 110/110 - 1s - 6ms/step - accuracy: 0.8103 - loss: 0.8433 - val_accuracy: 0.7922 - val_loss: 0.4359 Epoch 95/100 110/110 - 1s - 6ms/step - accuracy: 0.8119 - loss: 0.8436 - val_accuracy: 0.7667 - val_loss: 0.4712 Epoch 96/100 110/110 - 0s - 3ms/step - accuracy: 0.8087 - loss: 0.8444 - val_accuracy: 0.8122 - val_loss: 0.4146 Epoch 97/100 110/110 - 1s - 6ms/step - accuracy: 0.8131 - loss: 0.8417 - val_accuracy: 0.7911 - val_loss: 0.4353 Epoch 98/100 110/110 - 1s - 5ms/step - accuracy: 0.8103 - loss: 0.8409 - val_accuracy: 0.8244 - val_loss: 0.3952 Epoch 99/100 110/110 - 1s - 5ms/step - accuracy: 0.8150 - loss: 0.8396 - val_accuracy: 0.7511 - val_loss: 0.4937 Epoch 100/100 110/110 - 1s - 6ms/step - accuracy: 0.8117 - loss: 0.8413 - val_accuracy: 0.7744 - val_loss: 0.4631
#Plotting loss vs epoch
plot(history_1,'loss')
#Plotting Accuracy curve
plot(history_1,'accuracy')
results.loc[1] = [2,[64,32],['relu','relu'],100,64,"sgd",['-', "-"],"xavier","-",history_1.history["loss"][-1],history_1.history["val_loss"][-1],history_1.history["accuracy"][-1],history_1.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
#observing model performance training data
model_performance_classification(model_1, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.807429 | 0.807429 | 0.85745 | 0.821435 |
#Model performance for validation dataset
model_performance_classification(model_1, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.774444 | 0.774444 | 0.83196 | 0.79089 |
Observations
- By reducing the batch size,training and validation loss has reduced from 125% to 84% and 60% to 46% ,respectively and increased accuracy for both training and validation from 72% to 81% and 72.7% to 77% resp.which is improvement in model performance.
- But there are fluctuations in accuracy and loss which is indication of noise which is not good sign of good model.
Model 2
- 3. Reducing hidden layer:
- hidden layer with 64 neurons and 'relu' activation function
# clears the current Keras session, resetting all layers and models previously created, freeing up memory and resources.
tf.keras.backend.clear_session()
#defining model with new parameters
model2 = Sequential()
model2.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model2.add(Dense(1,activation="sigmoid"))
#model summary
model2.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 1) │ 65 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 833 (3.25 KB)
Trainable params: 833 (3.25 KB)
Non-trainable params: 0 (0.00 B)
#compiling model
optimizer = tf.keras.optimizers.SGD() # defining SGD as the optimizer to be used
model2.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#training model
start = time.time()
history2 = model2.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=100,class_weight=cw_dict, verbose=2,batch_size = 64)
end=time.time()
Epoch 1/100 110/110 - 2s - 21ms/step - accuracy: 0.7231 - loss: 1.2787 - val_accuracy: 0.6933 - val_loss: 0.6048 Epoch 2/100 110/110 - 1s - 10ms/step - accuracy: 0.7034 - loss: 1.2049 - val_accuracy: 0.6978 - val_loss: 0.5937 Epoch 3/100 110/110 - 1s - 5ms/step - accuracy: 0.7111 - loss: 1.1787 - val_accuracy: 0.7050 - val_loss: 0.5838 Epoch 4/100 110/110 - 1s - 6ms/step - accuracy: 0.7100 - loss: 1.1655 - val_accuracy: 0.7167 - val_loss: 0.5648 Epoch 5/100 110/110 - 1s - 6ms/step - accuracy: 0.7131 - loss: 1.1566 - val_accuracy: 0.7250 - val_loss: 0.5499 Epoch 6/100 110/110 - 1s - 5ms/step - accuracy: 0.7194 - loss: 1.1505 - val_accuracy: 0.7139 - val_loss: 0.5666 Epoch 7/100 110/110 - 1s - 5ms/step - accuracy: 0.7121 - loss: 1.1451 - val_accuracy: 0.7289 - val_loss: 0.5472 Epoch 8/100 110/110 - 0s - 4ms/step - accuracy: 0.7164 - loss: 1.1402 - val_accuracy: 0.7111 - val_loss: 0.5662 Epoch 9/100 110/110 - 1s - 5ms/step - accuracy: 0.7133 - loss: 1.1365 - val_accuracy: 0.7211 - val_loss: 0.5547 Epoch 10/100 110/110 - 0s - 4ms/step - accuracy: 0.7136 - loss: 1.1321 - val_accuracy: 0.7206 - val_loss: 0.5499 Epoch 11/100 110/110 - 0s - 4ms/step - accuracy: 0.7131 - loss: 1.1274 - val_accuracy: 0.7322 - val_loss: 0.5352 Epoch 12/100 110/110 - 1s - 5ms/step - accuracy: 0.7177 - loss: 1.1250 - val_accuracy: 0.7233 - val_loss: 0.5472 Epoch 13/100 110/110 - 1s - 5ms/step - accuracy: 0.7140 - loss: 1.1218 - val_accuracy: 0.7144 - val_loss: 0.5548 Epoch 14/100 110/110 - 0s - 3ms/step - accuracy: 0.7160 - loss: 1.1182 - val_accuracy: 0.7111 - val_loss: 0.5567 Epoch 15/100 110/110 - 1s - 6ms/step - accuracy: 0.7141 - loss: 1.1151 - val_accuracy: 0.7089 - val_loss: 0.5566 Epoch 16/100 110/110 - 0s - 4ms/step - accuracy: 0.7157 - loss: 1.1123 - val_accuracy: 0.7106 - val_loss: 0.5525 Epoch 17/100 110/110 - 1s - 5ms/step - accuracy: 0.7153 - loss: 1.1087 - val_accuracy: 0.7122 - val_loss: 0.5561 Epoch 18/100 110/110 - 0s - 3ms/step - accuracy: 0.7140 - loss: 1.1063 - val_accuracy: 0.7206 - val_loss: 0.5360 Epoch 19/100 110/110 - 1s - 7ms/step - accuracy: 0.7187 - loss: 1.1030 - val_accuracy: 0.7122 - val_loss: 0.5477 Epoch 20/100 110/110 - 1s - 5ms/step - accuracy: 0.7161 - loss: 1.1000 - val_accuracy: 0.7189 - val_loss: 0.5381 Epoch 21/100 110/110 - 1s - 5ms/step - accuracy: 0.7191 - loss: 1.0971 - val_accuracy: 0.7272 - val_loss: 0.5268 Epoch 22/100 110/110 - 1s - 6ms/step - accuracy: 0.7213 - loss: 1.0945 - val_accuracy: 0.7222 - val_loss: 0.5311 Epoch 23/100 110/110 - 1s - 5ms/step - accuracy: 0.7200 - loss: 1.0914 - val_accuracy: 0.7283 - val_loss: 0.5253 Epoch 24/100 110/110 - 0s - 3ms/step - accuracy: 0.7244 - loss: 1.0882 - val_accuracy: 0.7239 - val_loss: 0.5304 Epoch 25/100 110/110 - 0s - 3ms/step - accuracy: 0.7259 - loss: 1.0851 - val_accuracy: 0.7250 - val_loss: 0.5268 Epoch 26/100 110/110 - 0s - 3ms/step - accuracy: 0.7266 - loss: 1.0819 - val_accuracy: 0.7250 - val_loss: 0.5335 Epoch 27/100 110/110 - 1s - 5ms/step - accuracy: 0.7297 - loss: 1.0785 - val_accuracy: 0.7283 - val_loss: 0.5226 Epoch 28/100 110/110 - 1s - 6ms/step - accuracy: 0.7384 - loss: 1.0743 - val_accuracy: 0.7017 - val_loss: 0.5596 Epoch 29/100 110/110 - 0s - 3ms/step - accuracy: 0.7246 - loss: 1.0722 - val_accuracy: 0.7467 - val_loss: 0.5067 Epoch 30/100 110/110 - 0s - 3ms/step - accuracy: 0.7371 - loss: 1.0690 - val_accuracy: 0.7367 - val_loss: 0.5214 Epoch 31/100 110/110 - 1s - 6ms/step - accuracy: 0.7371 - loss: 1.0665 - val_accuracy: 0.7294 - val_loss: 0.5285 Epoch 32/100 110/110 - 1s - 5ms/step - accuracy: 0.7373 - loss: 1.0634 - val_accuracy: 0.7250 - val_loss: 0.5328 Epoch 33/100 110/110 - 0s - 3ms/step - accuracy: 0.7407 - loss: 1.0604 - val_accuracy: 0.7083 - val_loss: 0.5476 Epoch 34/100 110/110 - 0s - 3ms/step - accuracy: 0.7359 - loss: 1.0577 - val_accuracy: 0.7456 - val_loss: 0.5144 Epoch 35/100 110/110 - 0s - 3ms/step - accuracy: 0.7474 - loss: 1.0546 - val_accuracy: 0.7233 - val_loss: 0.5350 Epoch 36/100 110/110 - 0s - 4ms/step - accuracy: 0.7413 - loss: 1.0516 - val_accuracy: 0.7539 - val_loss: 0.5007 Epoch 37/100 110/110 - 0s - 3ms/step - accuracy: 0.7501 - loss: 1.0487 - val_accuracy: 0.7378 - val_loss: 0.5184 Epoch 38/100 110/110 - 0s - 4ms/step - accuracy: 0.7469 - loss: 1.0457 - val_accuracy: 0.7550 - val_loss: 0.4979 Epoch 39/100 110/110 - 1s - 5ms/step - accuracy: 0.7529 - loss: 1.0432 - val_accuracy: 0.7400 - val_loss: 0.5153 Epoch 40/100 110/110 - 0s - 3ms/step - accuracy: 0.7517 - loss: 1.0402 - val_accuracy: 0.7506 - val_loss: 0.5080 Epoch 41/100 110/110 - 0s - 3ms/step - accuracy: 0.7547 - loss: 1.0369 - val_accuracy: 0.7511 - val_loss: 0.5004 Epoch 42/100 110/110 - 0s - 4ms/step - accuracy: 0.7547 - loss: 1.0342 - val_accuracy: 0.7344 - val_loss: 0.5217 Epoch 43/100 110/110 - 1s - 6ms/step - accuracy: 0.7517 - loss: 1.0315 - val_accuracy: 0.7550 - val_loss: 0.4910 Epoch 44/100 110/110 - 1s - 5ms/step - accuracy: 0.7576 - loss: 1.0291 - val_accuracy: 0.7400 - val_loss: 0.5121 Epoch 45/100 110/110 - 0s - 3ms/step - accuracy: 0.7541 - loss: 1.0267 - val_accuracy: 0.7517 - val_loss: 0.4945 Epoch 46/100 110/110 - 0s - 3ms/step - accuracy: 0.7573 - loss: 1.0236 - val_accuracy: 0.7511 - val_loss: 0.4951 Epoch 47/100 110/110 - 1s - 6ms/step - accuracy: 0.7599 - loss: 1.0205 - val_accuracy: 0.7517 - val_loss: 0.4922 Epoch 48/100 110/110 - 1s - 6ms/step - accuracy: 0.7614 - loss: 1.0182 - val_accuracy: 0.7450 - val_loss: 0.4981 Epoch 49/100 110/110 - 1s - 6ms/step - accuracy: 0.7574 - loss: 1.0154 - val_accuracy: 0.7583 - val_loss: 0.4809 Epoch 50/100 110/110 - 1s - 6ms/step - accuracy: 0.7644 - loss: 1.0130 - val_accuracy: 0.7489 - val_loss: 0.4937 Epoch 51/100 110/110 - 1s - 5ms/step - accuracy: 0.7637 - loss: 1.0099 - val_accuracy: 0.7444 - val_loss: 0.5035 Epoch 52/100 110/110 - 1s - 5ms/step - accuracy: 0.7660 - loss: 1.0078 - val_accuracy: 0.7383 - val_loss: 0.5206 Epoch 53/100 110/110 - 1s - 5ms/step - accuracy: 0.7640 - loss: 1.0054 - val_accuracy: 0.7450 - val_loss: 0.5080 Epoch 54/100 110/110 - 1s - 5ms/step - accuracy: 0.7621 - loss: 1.0032 - val_accuracy: 0.7500 - val_loss: 0.4921 Epoch 55/100 110/110 - 0s - 3ms/step - accuracy: 0.7680 - loss: 1.0015 - val_accuracy: 0.7439 - val_loss: 0.5076 Epoch 56/100 110/110 - 1s - 5ms/step - accuracy: 0.7659 - loss: 0.9986 - val_accuracy: 0.7494 - val_loss: 0.4923 Epoch 57/100 110/110 - 0s - 3ms/step - accuracy: 0.7667 - loss: 0.9967 - val_accuracy: 0.7517 - val_loss: 0.4931 Epoch 58/100 110/110 - 1s - 6ms/step - accuracy: 0.7659 - loss: 0.9948 - val_accuracy: 0.7500 - val_loss: 0.4962 Epoch 59/100 110/110 - 0s - 3ms/step - accuracy: 0.7690 - loss: 0.9923 - val_accuracy: 0.7406 - val_loss: 0.5102 Epoch 60/100 110/110 - 1s - 6ms/step - accuracy: 0.7636 - loss: 0.9902 - val_accuracy: 0.7611 - val_loss: 0.4865 Epoch 61/100 110/110 - 0s - 3ms/step - accuracy: 0.7687 - loss: 0.9884 - val_accuracy: 0.7500 - val_loss: 0.5020 Epoch 62/100 110/110 - 0s - 3ms/step - accuracy: 0.7657 - loss: 0.9859 - val_accuracy: 0.7778 - val_loss: 0.4563 Epoch 63/100 110/110 - 0s - 3ms/step - accuracy: 0.7691 - loss: 0.9847 - val_accuracy: 0.7444 - val_loss: 0.5054 Epoch 64/100 110/110 - 1s - 6ms/step - accuracy: 0.7681 - loss: 0.9827 - val_accuracy: 0.7422 - val_loss: 0.5056 Epoch 65/100 110/110 - 0s - 3ms/step - accuracy: 0.7697 - loss: 0.9798 - val_accuracy: 0.7633 - val_loss: 0.4883 Epoch 66/100 110/110 - 0s - 3ms/step - accuracy: 0.7691 - loss: 0.9786 - val_accuracy: 0.7639 - val_loss: 0.4854 Epoch 67/100 110/110 - 0s - 3ms/step - accuracy: 0.7680 - loss: 0.9771 - val_accuracy: 0.7700 - val_loss: 0.4733 Epoch 68/100 110/110 - 0s - 3ms/step - accuracy: 0.7726 - loss: 0.9754 - val_accuracy: 0.7428 - val_loss: 0.5141 Epoch 69/100 110/110 - 0s - 3ms/step - accuracy: 0.7679 - loss: 0.9731 - val_accuracy: 0.7717 - val_loss: 0.4728 Epoch 70/100 110/110 - 0s - 3ms/step - accuracy: 0.7726 - loss: 0.9720 - val_accuracy: 0.7489 - val_loss: 0.5029 Epoch 71/100 110/110 - 0s - 3ms/step - accuracy: 0.7707 - loss: 0.9698 - val_accuracy: 0.7583 - val_loss: 0.4963 Epoch 72/100 110/110 - 1s - 5ms/step - accuracy: 0.7709 - loss: 0.9684 - val_accuracy: 0.7567 - val_loss: 0.4936 Epoch 73/100 110/110 - 0s - 4ms/step - accuracy: 0.7719 - loss: 0.9664 - val_accuracy: 0.7583 - val_loss: 0.4929 Epoch 74/100 110/110 - 0s - 3ms/step - accuracy: 0.7710 - loss: 0.9662 - val_accuracy: 0.7711 - val_loss: 0.4738 Epoch 75/100 110/110 - 0s - 3ms/step - accuracy: 0.7741 - loss: 0.9627 - val_accuracy: 0.7817 - val_loss: 0.4558 Epoch 76/100 110/110 - 1s - 5ms/step - accuracy: 0.7780 - loss: 0.9627 - val_accuracy: 0.7528 - val_loss: 0.5004 Epoch 77/100 110/110 - 0s - 4ms/step - accuracy: 0.7733 - loss: 0.9604 - val_accuracy: 0.7722 - val_loss: 0.4693 Epoch 78/100 110/110 - 1s - 6ms/step - accuracy: 0.7771 - loss: 0.9587 - val_accuracy: 0.7506 - val_loss: 0.5017 Epoch 79/100 110/110 - 0s - 4ms/step - accuracy: 0.7740 - loss: 0.9583 - val_accuracy: 0.7689 - val_loss: 0.4843 Epoch 80/100 110/110 - 1s - 6ms/step - accuracy: 0.7783 - loss: 0.9562 - val_accuracy: 0.7550 - val_loss: 0.4967 Epoch 81/100 110/110 - 1s - 6ms/step - accuracy: 0.7740 - loss: 0.9545 - val_accuracy: 0.7678 - val_loss: 0.4786 Epoch 82/100 110/110 - 0s - 5ms/step - accuracy: 0.7743 - loss: 0.9531 - val_accuracy: 0.7894 - val_loss: 0.4442 Epoch 83/100 110/110 - 1s - 5ms/step - accuracy: 0.7799 - loss: 0.9536 - val_accuracy: 0.7778 - val_loss: 0.4647 Epoch 84/100 110/110 - 1s - 5ms/step - accuracy: 0.7801 - loss: 0.9521 - val_accuracy: 0.7656 - val_loss: 0.4856 Epoch 85/100 110/110 - 0s - 3ms/step - accuracy: 0.7773 - loss: 0.9507 - val_accuracy: 0.7783 - val_loss: 0.4581 Epoch 86/100 110/110 - 0s - 3ms/step - accuracy: 0.7806 - loss: 0.9488 - val_accuracy: 0.7767 - val_loss: 0.4627 Epoch 87/100 110/110 - 1s - 6ms/step - accuracy: 0.7836 - loss: 0.9470 - val_accuracy: 0.7394 - val_loss: 0.5159 Epoch 88/100 110/110 - 0s - 3ms/step - accuracy: 0.7777 - loss: 0.9477 - val_accuracy: 0.7511 - val_loss: 0.5023 Epoch 89/100 110/110 - 1s - 6ms/step - accuracy: 0.7820 - loss: 0.9455 - val_accuracy: 0.7400 - val_loss: 0.5119 Epoch 90/100 110/110 - 1s - 6ms/step - accuracy: 0.7791 - loss: 0.9453 - val_accuracy: 0.7783 - val_loss: 0.4592 Epoch 91/100 110/110 - 0s - 3ms/step - accuracy: 0.7820 - loss: 0.9433 - val_accuracy: 0.7617 - val_loss: 0.4837 Epoch 92/100 110/110 - 0s - 4ms/step - accuracy: 0.7777 - loss: 0.9423 - val_accuracy: 0.7839 - val_loss: 0.4507 Epoch 93/100 110/110 - 1s - 6ms/step - accuracy: 0.7846 - loss: 0.9421 - val_accuracy: 0.7672 - val_loss: 0.4796 Epoch 94/100 110/110 - 1s - 5ms/step - accuracy: 0.7824 - loss: 0.9403 - val_accuracy: 0.7644 - val_loss: 0.4818 Epoch 95/100 110/110 - 0s - 4ms/step - accuracy: 0.7783 - loss: 0.9391 - val_accuracy: 0.7672 - val_loss: 0.4816 Epoch 96/100 110/110 - 1s - 5ms/step - accuracy: 0.7806 - loss: 0.9385 - val_accuracy: 0.7650 - val_loss: 0.4775 Epoch 97/100 110/110 - 0s - 3ms/step - accuracy: 0.7813 - loss: 0.9370 - val_accuracy: 0.7828 - val_loss: 0.4509 Epoch 98/100 110/110 - 1s - 5ms/step - accuracy: 0.7826 - loss: 0.9367 - val_accuracy: 0.7639 - val_loss: 0.4802 Epoch 99/100 110/110 - 0s - 3ms/step - accuracy: 0.7826 - loss: 0.9352 - val_accuracy: 0.7750 - val_loss: 0.4630 Epoch 100/100 110/110 - 1s - 6ms/step - accuracy: 0.7829 - loss: 0.9346 - val_accuracy: 0.7561 - val_loss: 0.4935
#plot accuracy vs epoach
plot(history2,'accuracy')
#plot loss vs epoch
plot(history2,'loss')
#defining result dataset
# results
results.loc[2] = [1,64,'relu',100,64,"sgd",['-', "-"],"xavier","-",history2.history["loss"][-1],history2.history["val_loss"][-1],history2.history["accuracy"][-1],history2.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
#Model performance for training dataset
model_performance_classification(model2, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.767571 | 0.767571 | 0.842814 | 0.787406 |
#model performance for validation dataset
model_performance_classification(model2, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.756111 | 0.756111 | 0.827613 | 0.775518 |
Observations
- Reducing the hidden layer,training and validation accuracy reduced with increase in loss and increased osscilations in both responses.
- Lets use two hidden layers with change in one activation function and chage in epoch size.
Model 3
We'll start out with a baseline model having the following configuration:
- 1 input, 2 hidden, 1 output layers
- relu and tanh activations for the first and second hidden layer respectively
- Stochastic Gradient Descent (SGD)
# clears the current Keras session, resetting all layers and models previously created, freeing up memory and resources.
tf.keras.backend.clear_session()
#define model
model3 = Sequential()
model3.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model3.add(Dense(32,activation="tanh"))
model3.add(Dense(1,activation="sigmoid"))
model3.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#compile model
optimizer = tf.keras.optimizers.SGD() # defining SGD as the optimizer to be used
model3.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#train model
start = time.time()
history3 = model3.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 64)
Epoch 1/150 110/110 - 2s - 16ms/step - accuracy: 0.6596 - loss: 1.3239 - val_accuracy: 0.6833 - val_loss: 0.6346 Epoch 2/150 110/110 - 0s - 4ms/step - accuracy: 0.7153 - loss: 1.2287 - val_accuracy: 0.7111 - val_loss: 0.6020 Epoch 3/150 110/110 - 1s - 5ms/step - accuracy: 0.7219 - loss: 1.1835 - val_accuracy: 0.7228 - val_loss: 0.5855 Epoch 4/150 110/110 - 1s - 6ms/step - accuracy: 0.7170 - loss: 1.1622 - val_accuracy: 0.7200 - val_loss: 0.5805 Epoch 5/150 110/110 - 1s - 5ms/step - accuracy: 0.7130 - loss: 1.1504 - val_accuracy: 0.7422 - val_loss: 0.5423 Epoch 6/150 110/110 - 0s - 3ms/step - accuracy: 0.7163 - loss: 1.1417 - val_accuracy: 0.7217 - val_loss: 0.5558 Epoch 7/150 110/110 - 1s - 5ms/step - accuracy: 0.7154 - loss: 1.1344 - val_accuracy: 0.7011 - val_loss: 0.5792 Epoch 8/150 110/110 - 0s - 3ms/step - accuracy: 0.7054 - loss: 1.1298 - val_accuracy: 0.7467 - val_loss: 0.5235 Epoch 9/150 110/110 - 1s - 5ms/step - accuracy: 0.7151 - loss: 1.1252 - val_accuracy: 0.7422 - val_loss: 0.5247 Epoch 10/150 110/110 - 0s - 4ms/step - accuracy: 0.7153 - loss: 1.1206 - val_accuracy: 0.7072 - val_loss: 0.5651 Epoch 11/150 110/110 - 1s - 7ms/step - accuracy: 0.7109 - loss: 1.1162 - val_accuracy: 0.7194 - val_loss: 0.5496 Epoch 12/150 110/110 - 1s - 5ms/step - accuracy: 0.7120 - loss: 1.1120 - val_accuracy: 0.7272 - val_loss: 0.5362 Epoch 13/150 110/110 - 1s - 6ms/step - accuracy: 0.7119 - loss: 1.1074 - val_accuracy: 0.7361 - val_loss: 0.5255 Epoch 14/150 110/110 - 1s - 6ms/step - accuracy: 0.7193 - loss: 1.1030 - val_accuracy: 0.7344 - val_loss: 0.5275 Epoch 15/150 110/110 - 1s - 6ms/step - accuracy: 0.7187 - loss: 1.0985 - val_accuracy: 0.7567 - val_loss: 0.5030 Epoch 16/150 110/110 - 0s - 4ms/step - accuracy: 0.7271 - loss: 1.0940 - val_accuracy: 0.7367 - val_loss: 0.5185 Epoch 17/150 110/110 - 0s - 4ms/step - accuracy: 0.7267 - loss: 1.0889 - val_accuracy: 0.7161 - val_loss: 0.5471 Epoch 18/150 110/110 - 0s - 3ms/step - accuracy: 0.7231 - loss: 1.0842 - val_accuracy: 0.7378 - val_loss: 0.5201 Epoch 19/150 110/110 - 0s - 4ms/step - accuracy: 0.7271 - loss: 1.0801 - val_accuracy: 0.7356 - val_loss: 0.5257 Epoch 20/150 110/110 - 1s - 6ms/step - accuracy: 0.7337 - loss: 1.0747 - val_accuracy: 0.7433 - val_loss: 0.5126 Epoch 21/150 110/110 - 1s - 5ms/step - accuracy: 0.7366 - loss: 1.0686 - val_accuracy: 0.7244 - val_loss: 0.5320 Epoch 22/150 110/110 - 0s - 3ms/step - accuracy: 0.7400 - loss: 1.0631 - val_accuracy: 0.6872 - val_loss: 0.5809 Epoch 23/150 110/110 - 0s - 3ms/step - accuracy: 0.7363 - loss: 1.0596 - val_accuracy: 0.7272 - val_loss: 0.5325 Epoch 24/150 110/110 - 1s - 6ms/step - accuracy: 0.7414 - loss: 1.0519 - val_accuracy: 0.7328 - val_loss: 0.5218 Epoch 25/150 110/110 - 1s - 5ms/step - accuracy: 0.7477 - loss: 1.0450 - val_accuracy: 0.7139 - val_loss: 0.5531 Epoch 26/150 110/110 - 0s - 3ms/step - accuracy: 0.7473 - loss: 1.0382 - val_accuracy: 0.7433 - val_loss: 0.5092 Epoch 27/150 110/110 - 1s - 5ms/step - accuracy: 0.7524 - loss: 1.0336 - val_accuracy: 0.7506 - val_loss: 0.4979 Epoch 28/150 110/110 - 0s - 3ms/step - accuracy: 0.7543 - loss: 1.0267 - val_accuracy: 0.7444 - val_loss: 0.5107 Epoch 29/150 110/110 - 0s - 3ms/step - accuracy: 0.7577 - loss: 1.0203 - val_accuracy: 0.7594 - val_loss: 0.4946 Epoch 30/150 110/110 - 0s - 3ms/step - accuracy: 0.7606 - loss: 1.0153 - val_accuracy: 0.7539 - val_loss: 0.4943 Epoch 31/150 110/110 - 1s - 6ms/step - accuracy: 0.7633 - loss: 1.0098 - val_accuracy: 0.7111 - val_loss: 0.5540 Epoch 32/150 110/110 - 1s - 5ms/step - accuracy: 0.7621 - loss: 1.0047 - val_accuracy: 0.7450 - val_loss: 0.5100 Epoch 33/150 110/110 - 0s - 3ms/step - accuracy: 0.7611 - loss: 0.9993 - val_accuracy: 0.7461 - val_loss: 0.5051 Epoch 34/150 110/110 - 1s - 5ms/step - accuracy: 0.7659 - loss: 0.9936 - val_accuracy: 0.7417 - val_loss: 0.5095 Epoch 35/150 110/110 - 0s - 3ms/step - accuracy: 0.7639 - loss: 0.9895 - val_accuracy: 0.7550 - val_loss: 0.4941 Epoch 36/150 110/110 - 0s - 3ms/step - accuracy: 0.7666 - loss: 0.9835 - val_accuracy: 0.7767 - val_loss: 0.4610 Epoch 37/150 110/110 - 1s - 5ms/step - accuracy: 0.7671 - loss: 0.9814 - val_accuracy: 0.7778 - val_loss: 0.4644 Epoch 38/150 110/110 - 0s - 4ms/step - accuracy: 0.7726 - loss: 0.9759 - val_accuracy: 0.7700 - val_loss: 0.4808 Epoch 39/150 110/110 - 1s - 6ms/step - accuracy: 0.7733 - loss: 0.9726 - val_accuracy: 0.7600 - val_loss: 0.4833 Epoch 40/150 110/110 - 0s - 4ms/step - accuracy: 0.7701 - loss: 0.9683 - val_accuracy: 0.7567 - val_loss: 0.4858 Epoch 41/150 110/110 - 1s - 6ms/step - accuracy: 0.7733 - loss: 0.9644 - val_accuracy: 0.7567 - val_loss: 0.4970 Epoch 42/150 110/110 - 1s - 5ms/step - accuracy: 0.7736 - loss: 0.9611 - val_accuracy: 0.7694 - val_loss: 0.4813 Epoch 43/150 110/110 - 1s - 6ms/step - accuracy: 0.7777 - loss: 0.9569 - val_accuracy: 0.7428 - val_loss: 0.5039 Epoch 44/150 110/110 - 0s - 3ms/step - accuracy: 0.7737 - loss: 0.9538 - val_accuracy: 0.7828 - val_loss: 0.4649 Epoch 45/150 110/110 - 0s - 3ms/step - accuracy: 0.7757 - loss: 0.9520 - val_accuracy: 0.7506 - val_loss: 0.4946 Epoch 46/150 110/110 - 1s - 5ms/step - accuracy: 0.7760 - loss: 0.9482 - val_accuracy: 0.7761 - val_loss: 0.4733 Epoch 47/150 110/110 - 0s - 4ms/step - accuracy: 0.7787 - loss: 0.9448 - val_accuracy: 0.7961 - val_loss: 0.4395 Epoch 48/150 110/110 - 0s - 3ms/step - accuracy: 0.7809 - loss: 0.9425 - val_accuracy: 0.7772 - val_loss: 0.4655 Epoch 49/150 110/110 - 1s - 6ms/step - accuracy: 0.7813 - loss: 0.9387 - val_accuracy: 0.7689 - val_loss: 0.4802 Epoch 50/150 110/110 - 1s - 6ms/step - accuracy: 0.7820 - loss: 0.9364 - val_accuracy: 0.7728 - val_loss: 0.4726 Epoch 51/150 110/110 - 1s - 6ms/step - accuracy: 0.7843 - loss: 0.9340 - val_accuracy: 0.7822 - val_loss: 0.4645 Epoch 52/150 110/110 - 0s - 3ms/step - accuracy: 0.7836 - loss: 0.9301 - val_accuracy: 0.7872 - val_loss: 0.4480 Epoch 53/150 110/110 - 1s - 5ms/step - accuracy: 0.7861 - loss: 0.9273 - val_accuracy: 0.7711 - val_loss: 0.4773 Epoch 54/150 110/110 - 0s - 3ms/step - accuracy: 0.7834 - loss: 0.9222 - val_accuracy: 0.7944 - val_loss: 0.4398 Epoch 55/150 110/110 - 1s - 6ms/step - accuracy: 0.7884 - loss: 0.9213 - val_accuracy: 0.7644 - val_loss: 0.4814 Epoch 56/150 110/110 - 1s - 5ms/step - accuracy: 0.7894 - loss: 0.9181 - val_accuracy: 0.7722 - val_loss: 0.4683 Epoch 57/150 110/110 - 1s - 5ms/step - accuracy: 0.7857 - loss: 0.9187 - val_accuracy: 0.7906 - val_loss: 0.4467 Epoch 58/150 110/110 - 1s - 5ms/step - accuracy: 0.7901 - loss: 0.9139 - val_accuracy: 0.7789 - val_loss: 0.4561 Epoch 59/150 110/110 - 1s - 6ms/step - accuracy: 0.7899 - loss: 0.9126 - val_accuracy: 0.7778 - val_loss: 0.4652 Epoch 60/150 110/110 - 0s - 3ms/step - accuracy: 0.7897 - loss: 0.9081 - val_accuracy: 0.7822 - val_loss: 0.4539 Epoch 61/150 110/110 - 0s - 3ms/step - accuracy: 0.7913 - loss: 0.9063 - val_accuracy: 0.7678 - val_loss: 0.4812 Epoch 62/150 110/110 - 0s - 3ms/step - accuracy: 0.7900 - loss: 0.9060 - val_accuracy: 0.7606 - val_loss: 0.4874 Epoch 63/150 110/110 - 1s - 6ms/step - accuracy: 0.7911 - loss: 0.9047 - val_accuracy: 0.7378 - val_loss: 0.5215 Epoch 64/150 110/110 - 1s - 7ms/step - accuracy: 0.7956 - loss: 0.8995 - val_accuracy: 0.7933 - val_loss: 0.4287 Epoch 65/150 110/110 - 1s - 6ms/step - accuracy: 0.7979 - loss: 0.8971 - val_accuracy: 0.7917 - val_loss: 0.4429 Epoch 66/150 110/110 - 1s - 5ms/step - accuracy: 0.7971 - loss: 0.8962 - val_accuracy: 0.7944 - val_loss: 0.4443 Epoch 67/150 110/110 - 1s - 6ms/step - accuracy: 0.7979 - loss: 0.8905 - val_accuracy: 0.7539 - val_loss: 0.4940 Epoch 68/150 110/110 - 1s - 5ms/step - accuracy: 0.7956 - loss: 0.8937 - val_accuracy: 0.8106 - val_loss: 0.4206 Epoch 69/150 110/110 - 1s - 5ms/step - accuracy: 0.7983 - loss: 0.8904 - val_accuracy: 0.8189 - val_loss: 0.3970 Epoch 70/150 110/110 - 0s - 3ms/step - accuracy: 0.7961 - loss: 0.8883 - val_accuracy: 0.7700 - val_loss: 0.4677 Epoch 71/150 110/110 - 1s - 6ms/step - accuracy: 0.7931 - loss: 0.8895 - val_accuracy: 0.7839 - val_loss: 0.4515 Epoch 72/150 110/110 - 1s - 6ms/step - accuracy: 0.7981 - loss: 0.8850 - val_accuracy: 0.7294 - val_loss: 0.5374 Epoch 73/150 110/110 - 1s - 10ms/step - accuracy: 0.7980 - loss: 0.8840 - val_accuracy: 0.7811 - val_loss: 0.4558 Epoch 74/150 110/110 - 1s - 5ms/step - accuracy: 0.7977 - loss: 0.8859 - val_accuracy: 0.7617 - val_loss: 0.4872 Epoch 75/150 110/110 - 1s - 8ms/step - accuracy: 0.8003 - loss: 0.8800 - val_accuracy: 0.7356 - val_loss: 0.5158 Epoch 76/150 110/110 - 1s - 12ms/step - accuracy: 0.8017 - loss: 0.8793 - val_accuracy: 0.7783 - val_loss: 0.4598 Epoch 77/150 110/110 - 1s - 12ms/step - accuracy: 0.7991 - loss: 0.8813 - val_accuracy: 0.7850 - val_loss: 0.4353 Epoch 78/150 110/110 - 1s - 10ms/step - accuracy: 0.8030 - loss: 0.8799 - val_accuracy: 0.7811 - val_loss: 0.4537 Epoch 79/150 110/110 - 1s - 10ms/step - accuracy: 0.8010 - loss: 0.8758 - val_accuracy: 0.7783 - val_loss: 0.4530 Epoch 80/150 110/110 - 1s - 6ms/step - accuracy: 0.8030 - loss: 0.8742 - val_accuracy: 0.7767 - val_loss: 0.4553 Epoch 81/150 110/110 - 1s - 7ms/step - accuracy: 0.8017 - loss: 0.8743 - val_accuracy: 0.7628 - val_loss: 0.4867 Epoch 82/150 110/110 - 2s - 14ms/step - accuracy: 0.7999 - loss: 0.8742 - val_accuracy: 0.8106 - val_loss: 0.4042 Epoch 83/150 110/110 - 2s - 14ms/step - accuracy: 0.8043 - loss: 0.8736 - val_accuracy: 0.8139 - val_loss: 0.4051 Epoch 84/150 110/110 - 2s - 18ms/step - accuracy: 0.8069 - loss: 0.8681 - val_accuracy: 0.8039 - val_loss: 0.4223 Epoch 85/150 110/110 - 1s - 10ms/step - accuracy: 0.8029 - loss: 0.8687 - val_accuracy: 0.7861 - val_loss: 0.4485 Epoch 86/150 110/110 - 1s - 8ms/step - accuracy: 0.8031 - loss: 0.8711 - val_accuracy: 0.8283 - val_loss: 0.3882 Epoch 87/150 110/110 - 1s - 7ms/step - accuracy: 0.8050 - loss: 0.8700 - val_accuracy: 0.7628 - val_loss: 0.4835 Epoch 88/150 110/110 - 1s - 6ms/step - accuracy: 0.8043 - loss: 0.8685 - val_accuracy: 0.7800 - val_loss: 0.4520 Epoch 89/150 110/110 - 0s - 4ms/step - accuracy: 0.8046 - loss: 0.8654 - val_accuracy: 0.8333 - val_loss: 0.3814 Epoch 90/150 110/110 - 1s - 5ms/step - accuracy: 0.8093 - loss: 0.8652 - val_accuracy: 0.7967 - val_loss: 0.4300 Epoch 91/150 110/110 - 1s - 7ms/step - accuracy: 0.8073 - loss: 0.8648 - val_accuracy: 0.7894 - val_loss: 0.4381 Epoch 92/150 110/110 - 1s - 5ms/step - accuracy: 0.8014 - loss: 0.8627 - val_accuracy: 0.8267 - val_loss: 0.3905 Epoch 93/150 110/110 - 0s - 4ms/step - accuracy: 0.8059 - loss: 0.8649 - val_accuracy: 0.7406 - val_loss: 0.5244 Epoch 94/150 110/110 - 1s - 5ms/step - accuracy: 0.8069 - loss: 0.8652 - val_accuracy: 0.7744 - val_loss: 0.4740 Epoch 95/150 110/110 - 0s - 3ms/step - accuracy: 0.8086 - loss: 0.8644 - val_accuracy: 0.7956 - val_loss: 0.4306 Epoch 96/150 110/110 - 1s - 6ms/step - accuracy: 0.8067 - loss: 0.8608 - val_accuracy: 0.8078 - val_loss: 0.4139 Epoch 97/150 110/110 - 1s - 6ms/step - accuracy: 0.8087 - loss: 0.8568 - val_accuracy: 0.8072 - val_loss: 0.4203 Epoch 98/150 110/110 - 1s - 6ms/step - accuracy: 0.8101 - loss: 0.8552 - val_accuracy: 0.7917 - val_loss: 0.4404 Epoch 99/150 110/110 - 1s - 6ms/step - accuracy: 0.8070 - loss: 0.8590 - val_accuracy: 0.8267 - val_loss: 0.3860 Epoch 100/150 110/110 - 1s - 5ms/step - accuracy: 0.8060 - loss: 0.8627 - val_accuracy: 0.8133 - val_loss: 0.4067 Epoch 101/150 110/110 - 1s - 6ms/step - accuracy: 0.8100 - loss: 0.8562 - val_accuracy: 0.7956 - val_loss: 0.4331 Epoch 102/150 110/110 - 1s - 5ms/step - accuracy: 0.8137 - loss: 0.8564 - val_accuracy: 0.7811 - val_loss: 0.4536 Epoch 103/150 110/110 - 1s - 5ms/step - accuracy: 0.8050 - loss: 0.8569 - val_accuracy: 0.8250 - val_loss: 0.3902 Epoch 104/150 110/110 - 0s - 4ms/step - accuracy: 0.8126 - loss: 0.8570 - val_accuracy: 0.7728 - val_loss: 0.4782 Epoch 105/150 110/110 - 1s - 5ms/step - accuracy: 0.8049 - loss: 0.8596 - val_accuracy: 0.8233 - val_loss: 0.3913 Epoch 106/150 110/110 - 1s - 6ms/step - accuracy: 0.8071 - loss: 0.8525 - val_accuracy: 0.8350 - val_loss: 0.3719 Epoch 107/150 110/110 - 1s - 6ms/step - accuracy: 0.8094 - loss: 0.8547 - val_accuracy: 0.7644 - val_loss: 0.4752 Epoch 108/150 110/110 - 1s - 5ms/step - accuracy: 0.8054 - loss: 0.8529 - val_accuracy: 0.8044 - val_loss: 0.4203 Epoch 109/150 110/110 - 1s - 5ms/step - accuracy: 0.8133 - loss: 0.8537 - val_accuracy: 0.8200 - val_loss: 0.3910 Epoch 110/150 110/110 - 1s - 6ms/step - accuracy: 0.8099 - loss: 0.8500 - val_accuracy: 0.8039 - val_loss: 0.4194 Epoch 111/150 110/110 - 1s - 5ms/step - accuracy: 0.8067 - loss: 0.8515 - val_accuracy: 0.7711 - val_loss: 0.4617 Epoch 112/150 110/110 - 0s - 3ms/step - accuracy: 0.8074 - loss: 0.8537 - val_accuracy: 0.8056 - val_loss: 0.4208 Epoch 113/150 110/110 - 0s - 3ms/step - accuracy: 0.8066 - loss: 0.8492 - val_accuracy: 0.8300 - val_loss: 0.3808 Epoch 114/150 110/110 - 1s - 6ms/step - accuracy: 0.8060 - loss: 0.8512 - val_accuracy: 0.8017 - val_loss: 0.4142 Epoch 115/150 110/110 - 1s - 6ms/step - accuracy: 0.8129 - loss: 0.8476 - val_accuracy: 0.7450 - val_loss: 0.5153 Epoch 116/150 110/110 - 1s - 6ms/step - accuracy: 0.8114 - loss: 0.8461 - val_accuracy: 0.7678 - val_loss: 0.4848 Epoch 117/150 110/110 - 1s - 5ms/step - accuracy: 0.8049 - loss: 0.8491 - val_accuracy: 0.7844 - val_loss: 0.4673 Epoch 118/150 110/110 - 1s - 6ms/step - accuracy: 0.8109 - loss: 0.8457 - val_accuracy: 0.8239 - val_loss: 0.3938 Epoch 119/150 110/110 - 0s - 3ms/step - accuracy: 0.8149 - loss: 0.8447 - val_accuracy: 0.8322 - val_loss: 0.3834 Epoch 120/150 110/110 - 1s - 5ms/step - accuracy: 0.8151 - loss: 0.8473 - val_accuracy: 0.7389 - val_loss: 0.5272 Epoch 121/150 110/110 - 1s - 6ms/step - accuracy: 0.8093 - loss: 0.8511 - val_accuracy: 0.7850 - val_loss: 0.4503 Epoch 122/150 110/110 - 1s - 7ms/step - accuracy: 0.8099 - loss: 0.8432 - val_accuracy: 0.7411 - val_loss: 0.5313 Epoch 123/150 110/110 - 1s - 5ms/step - accuracy: 0.8066 - loss: 0.8487 - val_accuracy: 0.7989 - val_loss: 0.4230 Epoch 124/150 110/110 - 1s - 5ms/step - accuracy: 0.8109 - loss: 0.8430 - val_accuracy: 0.8039 - val_loss: 0.4265 Epoch 125/150 110/110 - 1s - 6ms/step - accuracy: 0.8114 - loss: 0.8429 - val_accuracy: 0.7528 - val_loss: 0.4989 Epoch 126/150 110/110 - 1s - 9ms/step - accuracy: 0.8143 - loss: 0.8414 - val_accuracy: 0.7639 - val_loss: 0.4816 Epoch 127/150 110/110 - 1s - 6ms/step - accuracy: 0.8087 - loss: 0.8439 - val_accuracy: 0.7650 - val_loss: 0.4876 Epoch 128/150 110/110 - 1s - 5ms/step - accuracy: 0.8091 - loss: 0.8397 - val_accuracy: 0.7861 - val_loss: 0.4438 Epoch 129/150 110/110 - 0s - 3ms/step - accuracy: 0.8113 - loss: 0.8401 - val_accuracy: 0.8061 - val_loss: 0.4204 Epoch 130/150 110/110 - 1s - 6ms/step - accuracy: 0.8146 - loss: 0.8380 - val_accuracy: 0.8022 - val_loss: 0.4271 Epoch 131/150 110/110 - 1s - 6ms/step - accuracy: 0.8126 - loss: 0.8424 - val_accuracy: 0.7450 - val_loss: 0.5284 Epoch 132/150 110/110 - 1s - 6ms/step - accuracy: 0.8146 - loss: 0.8383 - val_accuracy: 0.7778 - val_loss: 0.4733 Epoch 133/150 110/110 - 1s - 6ms/step - accuracy: 0.8066 - loss: 0.8384 - val_accuracy: 0.8011 - val_loss: 0.4259 Epoch 134/150 110/110 - 0s - 3ms/step - accuracy: 0.8110 - loss: 0.8421 - val_accuracy: 0.8411 - val_loss: 0.3738 Epoch 135/150 110/110 - 0s - 3ms/step - accuracy: 0.8139 - loss: 0.8352 - val_accuracy: 0.8333 - val_loss: 0.3817 Epoch 136/150 110/110 - 1s - 6ms/step - accuracy: 0.8127 - loss: 0.8351 - val_accuracy: 0.7950 - val_loss: 0.4354 Epoch 137/150 110/110 - 0s - 3ms/step - accuracy: 0.8107 - loss: 0.8338 - val_accuracy: 0.8344 - val_loss: 0.3759 Epoch 138/150 110/110 - 0s - 3ms/step - accuracy: 0.8123 - loss: 0.8323 - val_accuracy: 0.8017 - val_loss: 0.4152 Epoch 139/150 110/110 - 0s - 3ms/step - accuracy: 0.8147 - loss: 0.8346 - val_accuracy: 0.7472 - val_loss: 0.5053 Epoch 140/150 110/110 - 0s - 3ms/step - accuracy: 0.8107 - loss: 0.8343 - val_accuracy: 0.8222 - val_loss: 0.3972 Epoch 141/150 110/110 - 0s - 3ms/step - accuracy: 0.8126 - loss: 0.8305 - val_accuracy: 0.8167 - val_loss: 0.3987 Epoch 142/150 110/110 - 1s - 6ms/step - accuracy: 0.8104 - loss: 0.8346 - val_accuracy: 0.7839 - val_loss: 0.4516 Epoch 143/150 110/110 - 1s - 6ms/step - accuracy: 0.8143 - loss: 0.8359 - val_accuracy: 0.7722 - val_loss: 0.4673 Epoch 144/150 110/110 - 1s - 6ms/step - accuracy: 0.8116 - loss: 0.8320 - val_accuracy: 0.8117 - val_loss: 0.4071 Epoch 145/150 110/110 - 0s - 3ms/step - accuracy: 0.8093 - loss: 0.8332 - val_accuracy: 0.8372 - val_loss: 0.3729 Epoch 146/150 110/110 - 0s - 4ms/step - accuracy: 0.8167 - loss: 0.8257 - val_accuracy: 0.7622 - val_loss: 0.4875 Epoch 147/150 110/110 - 1s - 6ms/step - accuracy: 0.8104 - loss: 0.8309 - val_accuracy: 0.7617 - val_loss: 0.4899 Epoch 148/150 110/110 - 1s - 5ms/step - accuracy: 0.8133 - loss: 0.8314 - val_accuracy: 0.8061 - val_loss: 0.4158 Epoch 149/150 110/110 - 1s - 5ms/step - accuracy: 0.8113 - loss: 0.8302 - val_accuracy: 0.7594 - val_loss: 0.4896 Epoch 150/150 110/110 - 1s - 6ms/step - accuracy: 0.8126 - loss: 0.8306 - val_accuracy: 0.8139 - val_loss: 0.4044
#Plot accuracy curve
plot(history3,'accuracy')
#plot loss curve
plot(history3,'loss')
#Observe all model performances
results.loc[3] = [2,[64,32],['relu','tanh'],150,64,"sgd",['-', "-"],"xavier","-",history3.history["loss"][-1],history3.history["val_loss"][-1],history3.history["accuracy"][-1],history3.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
#model performace for training data
model_performance_classification(model3, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.840143 | 0.840143 | 0.86045 | 0.847176 |
# model performance for validation data
model_performance_classification(model3, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.813889 | 0.813889 | 0.836804 | 0.822053 |
Observations
- Accuracy and loss improved over model 1 but there are oscillations in responses.
- Recall and F1_score improved.
Model 4
We'll start out with a baseline model having the following configuration:
- 1 input, 2 hidden(128,32), 1 output layers
- tanh and tanh activations for the first and second hidden layer respectively
- Stochastic Gradient Descent (SGD)
# clears the current Keras session, resetting all layers and models previously created, freeing up memory and resources.
tf.keras.backend.clear_session()
#define model
model4 = Sequential()
model4.add(Dense(128,activation="tanh",input_dim=X_train.shape[1]))
model4.add(Dense(32,activation="tanh"))
model4.add(Dense(1,activation="sigmoid"))
#model summary
model4.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 128) │ 1,536 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 4,128 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 5,697 (22.25 KB)
Trainable params: 5,697 (22.25 KB)
Non-trainable params: 0 (0.00 B)
# compile model 4
optimizer = tf.keras.optimizers.SGD() # defining SGD as the optimizer to be used
model4.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#Train the model 4
start = time.time()
history4 = model4.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 64)
Epoch 1/150 110/110 - 1s - 10ms/step - accuracy: 0.6979 - loss: 1.2387 - val_accuracy: 0.7489 - val_loss: 0.5582 Epoch 2/150 110/110 - 0s - 5ms/step - accuracy: 0.7259 - loss: 1.1736 - val_accuracy: 0.7106 - val_loss: 0.5841 Epoch 3/150 110/110 - 1s - 5ms/step - accuracy: 0.7176 - loss: 1.1633 - val_accuracy: 0.7328 - val_loss: 0.5608 Epoch 4/150 110/110 - 0s - 3ms/step - accuracy: 0.7136 - loss: 1.1561 - val_accuracy: 0.7267 - val_loss: 0.5603 Epoch 5/150 110/110 - 1s - 6ms/step - accuracy: 0.7130 - loss: 1.1522 - val_accuracy: 0.7422 - val_loss: 0.5389 Epoch 6/150 110/110 - 1s - 6ms/step - accuracy: 0.7133 - loss: 1.1504 - val_accuracy: 0.7061 - val_loss: 0.5808 Epoch 7/150 110/110 - 1s - 5ms/step - accuracy: 0.7104 - loss: 1.1457 - val_accuracy: 0.7444 - val_loss: 0.5322 Epoch 8/150 110/110 - 1s - 6ms/step - accuracy: 0.7134 - loss: 1.1423 - val_accuracy: 0.7467 - val_loss: 0.5310 Epoch 9/150 110/110 - 1s - 5ms/step - accuracy: 0.7151 - loss: 1.1377 - val_accuracy: 0.7256 - val_loss: 0.5528 Epoch 10/150 110/110 - 0s - 3ms/step - accuracy: 0.7161 - loss: 1.1342 - val_accuracy: 0.7067 - val_loss: 0.5710 Epoch 11/150 110/110 - 1s - 6ms/step - accuracy: 0.7094 - loss: 1.1294 - val_accuracy: 0.7394 - val_loss: 0.5323 Epoch 12/150 110/110 - 1s - 7ms/step - accuracy: 0.7159 - loss: 1.1239 - val_accuracy: 0.7150 - val_loss: 0.5604 Epoch 13/150 110/110 - 1s - 6ms/step - accuracy: 0.7107 - loss: 1.1196 - val_accuracy: 0.7544 - val_loss: 0.5214 Epoch 14/150 110/110 - 1s - 5ms/step - accuracy: 0.7179 - loss: 1.1142 - val_accuracy: 0.7333 - val_loss: 0.5335 Epoch 15/150 110/110 - 1s - 7ms/step - accuracy: 0.7116 - loss: 1.1116 - val_accuracy: 0.7267 - val_loss: 0.5384 Epoch 16/150 110/110 - 1s - 9ms/step - accuracy: 0.7150 - loss: 1.1053 - val_accuracy: 0.7394 - val_loss: 0.5253 Epoch 17/150 110/110 - 1s - 5ms/step - accuracy: 0.7177 - loss: 1.0998 - val_accuracy: 0.7444 - val_loss: 0.5144 Epoch 18/150 110/110 - 0s - 4ms/step - accuracy: 0.7219 - loss: 1.0941 - val_accuracy: 0.7283 - val_loss: 0.5334 Epoch 19/150 110/110 - 1s - 6ms/step - accuracy: 0.7181 - loss: 1.0906 - val_accuracy: 0.7528 - val_loss: 0.5096 Epoch 20/150 110/110 - 1s - 5ms/step - accuracy: 0.7236 - loss: 1.0843 - val_accuracy: 0.7350 - val_loss: 0.5265 Epoch 21/150 110/110 - 1s - 5ms/step - accuracy: 0.7270 - loss: 1.0769 - val_accuracy: 0.7672 - val_loss: 0.4839 Epoch 22/150 110/110 - 0s - 3ms/step - accuracy: 0.7269 - loss: 1.0725 - val_accuracy: 0.7767 - val_loss: 0.4834 Epoch 23/150 110/110 - 1s - 5ms/step - accuracy: 0.7343 - loss: 1.0669 - val_accuracy: 0.7411 - val_loss: 0.5167 Epoch 24/150 110/110 - 1s - 6ms/step - accuracy: 0.7387 - loss: 1.0598 - val_accuracy: 0.7228 - val_loss: 0.5461 Epoch 25/150 110/110 - 1s - 6ms/step - accuracy: 0.7403 - loss: 1.0538 - val_accuracy: 0.7022 - val_loss: 0.5658 Epoch 26/150 110/110 - 1s - 5ms/step - accuracy: 0.7391 - loss: 1.0486 - val_accuracy: 0.7494 - val_loss: 0.5056 Epoch 27/150 110/110 - 0s - 4ms/step - accuracy: 0.7463 - loss: 1.0413 - val_accuracy: 0.7450 - val_loss: 0.5197 Epoch 28/150 110/110 - 1s - 6ms/step - accuracy: 0.7521 - loss: 1.0355 - val_accuracy: 0.7394 - val_loss: 0.5241 Epoch 29/150 110/110 - 1s - 5ms/step - accuracy: 0.7517 - loss: 1.0284 - val_accuracy: 0.7539 - val_loss: 0.5080 Epoch 30/150 110/110 - 0s - 3ms/step - accuracy: 0.7551 - loss: 1.0227 - val_accuracy: 0.7550 - val_loss: 0.5009 Epoch 31/150 110/110 - 1s - 6ms/step - accuracy: 0.7587 - loss: 1.0176 - val_accuracy: 0.7611 - val_loss: 0.4852 Epoch 32/150 110/110 - 0s - 4ms/step - accuracy: 0.7677 - loss: 1.0113 - val_accuracy: 0.7394 - val_loss: 0.5146 Epoch 33/150 110/110 - 1s - 5ms/step - accuracy: 0.7631 - loss: 1.0052 - val_accuracy: 0.7550 - val_loss: 0.4969 Epoch 34/150 110/110 - 1s - 8ms/step - accuracy: 0.7614 - loss: 1.0012 - val_accuracy: 0.7683 - val_loss: 0.4879 Epoch 35/150 110/110 - 1s - 5ms/step - accuracy: 0.7660 - loss: 0.9966 - val_accuracy: 0.7739 - val_loss: 0.4721 Epoch 36/150 110/110 - 1s - 5ms/step - accuracy: 0.7669 - loss: 0.9895 - val_accuracy: 0.7939 - val_loss: 0.4558 Epoch 37/150 110/110 - 1s - 5ms/step - accuracy: 0.7743 - loss: 0.9861 - val_accuracy: 0.7683 - val_loss: 0.4832 Epoch 38/150 110/110 - 1s - 6ms/step - accuracy: 0.7736 - loss: 0.9818 - val_accuracy: 0.7317 - val_loss: 0.5297 Epoch 39/150 110/110 - 1s - 5ms/step - accuracy: 0.7726 - loss: 0.9792 - val_accuracy: 0.7589 - val_loss: 0.4966 Epoch 40/150 110/110 - 0s - 4ms/step - accuracy: 0.7756 - loss: 0.9755 - val_accuracy: 0.7617 - val_loss: 0.4954 Epoch 41/150 110/110 - 0s - 4ms/step - accuracy: 0.7706 - loss: 0.9708 - val_accuracy: 0.7889 - val_loss: 0.4553 Epoch 42/150 110/110 - 1s - 5ms/step - accuracy: 0.7770 - loss: 0.9682 - val_accuracy: 0.7644 - val_loss: 0.4935 Epoch 43/150 110/110 - 0s - 4ms/step - accuracy: 0.7743 - loss: 0.9646 - val_accuracy: 0.7889 - val_loss: 0.4536 Epoch 44/150 110/110 - 0s - 3ms/step - accuracy: 0.7769 - loss: 0.9603 - val_accuracy: 0.7672 - val_loss: 0.4851 Epoch 45/150 110/110 - 1s - 6ms/step - accuracy: 0.7804 - loss: 0.9587 - val_accuracy: 0.7722 - val_loss: 0.4762 Epoch 46/150 110/110 - 0s - 3ms/step - accuracy: 0.7780 - loss: 0.9557 - val_accuracy: 0.7828 - val_loss: 0.4666 Epoch 47/150 110/110 - 0s - 3ms/step - accuracy: 0.7780 - loss: 0.9543 - val_accuracy: 0.7767 - val_loss: 0.4756 Epoch 48/150 110/110 - 0s - 3ms/step - accuracy: 0.7816 - loss: 0.9517 - val_accuracy: 0.7639 - val_loss: 0.4858 Epoch 49/150 110/110 - 0s - 3ms/step - accuracy: 0.7789 - loss: 0.9494 - val_accuracy: 0.7733 - val_loss: 0.4796 Epoch 50/150 110/110 - 0s - 4ms/step - accuracy: 0.7839 - loss: 0.9480 - val_accuracy: 0.7744 - val_loss: 0.4709 Epoch 51/150 110/110 - 1s - 5ms/step - accuracy: 0.7833 - loss: 0.9463 - val_accuracy: 0.7906 - val_loss: 0.4465 Epoch 52/150 110/110 - 0s - 4ms/step - accuracy: 0.7833 - loss: 0.9425 - val_accuracy: 0.7550 - val_loss: 0.5021 Epoch 53/150 110/110 - 0s - 3ms/step - accuracy: 0.7789 - loss: 0.9421 - val_accuracy: 0.7789 - val_loss: 0.4642 Epoch 54/150 110/110 - 0s - 3ms/step - accuracy: 0.7853 - loss: 0.9391 - val_accuracy: 0.7383 - val_loss: 0.5210 Epoch 55/150 110/110 - 0s - 3ms/step - accuracy: 0.7841 - loss: 0.9379 - val_accuracy: 0.7900 - val_loss: 0.4492 Epoch 56/150 110/110 - 0s - 4ms/step - accuracy: 0.7841 - loss: 0.9377 - val_accuracy: 0.7617 - val_loss: 0.4961 Epoch 57/150 110/110 - 1s - 5ms/step - accuracy: 0.7846 - loss: 0.9349 - val_accuracy: 0.7956 - val_loss: 0.4393 Epoch 58/150 110/110 - 1s - 6ms/step - accuracy: 0.7870 - loss: 0.9348 - val_accuracy: 0.7383 - val_loss: 0.5230 Epoch 59/150 110/110 - 1s - 5ms/step - accuracy: 0.7864 - loss: 0.9324 - val_accuracy: 0.7600 - val_loss: 0.4921 Epoch 60/150 110/110 - 1s - 6ms/step - accuracy: 0.7840 - loss: 0.9303 - val_accuracy: 0.7733 - val_loss: 0.4684 Epoch 61/150 110/110 - 1s - 5ms/step - accuracy: 0.7879 - loss: 0.9296 - val_accuracy: 0.7772 - val_loss: 0.4647 Epoch 62/150 110/110 - 1s - 8ms/step - accuracy: 0.7877 - loss: 0.9296 - val_accuracy: 0.7606 - val_loss: 0.4931 Epoch 63/150 110/110 - 1s - 5ms/step - accuracy: 0.7851 - loss: 0.9281 - val_accuracy: 0.7944 - val_loss: 0.4395 Epoch 64/150 110/110 - 1s - 5ms/step - accuracy: 0.7851 - loss: 0.9290 - val_accuracy: 0.7772 - val_loss: 0.4686 Epoch 65/150 110/110 - 1s - 5ms/step - accuracy: 0.7874 - loss: 0.9252 - val_accuracy: 0.7539 - val_loss: 0.4995 Epoch 66/150 110/110 - 1s - 6ms/step - accuracy: 0.7881 - loss: 0.9250 - val_accuracy: 0.7856 - val_loss: 0.4550 Epoch 67/150 110/110 - 0s - 4ms/step - accuracy: 0.7889 - loss: 0.9226 - val_accuracy: 0.7639 - val_loss: 0.4814 Epoch 68/150 110/110 - 0s - 4ms/step - accuracy: 0.7889 - loss: 0.9215 - val_accuracy: 0.7467 - val_loss: 0.5081 Epoch 69/150 110/110 - 1s - 5ms/step - accuracy: 0.7864 - loss: 0.9212 - val_accuracy: 0.7761 - val_loss: 0.4677 Epoch 70/150 110/110 - 0s - 3ms/step - accuracy: 0.7900 - loss: 0.9194 - val_accuracy: 0.7856 - val_loss: 0.4505 Epoch 71/150 110/110 - 1s - 5ms/step - accuracy: 0.7906 - loss: 0.9168 - val_accuracy: 0.7744 - val_loss: 0.4645 Epoch 72/150 110/110 - 1s - 6ms/step - accuracy: 0.7891 - loss: 0.9168 - val_accuracy: 0.7806 - val_loss: 0.4603 Epoch 73/150 110/110 - 1s - 6ms/step - accuracy: 0.7896 - loss: 0.9163 - val_accuracy: 0.7789 - val_loss: 0.4615 Epoch 74/150 110/110 - 1s - 5ms/step - accuracy: 0.7896 - loss: 0.9170 - val_accuracy: 0.7706 - val_loss: 0.4717 Epoch 75/150 110/110 - 0s - 3ms/step - accuracy: 0.7906 - loss: 0.9144 - val_accuracy: 0.7428 - val_loss: 0.5065 Epoch 76/150 110/110 - 0s - 3ms/step - accuracy: 0.7897 - loss: 0.9133 - val_accuracy: 0.7767 - val_loss: 0.4606 Epoch 77/150 110/110 - 0s - 3ms/step - accuracy: 0.7916 - loss: 0.9142 - val_accuracy: 0.7644 - val_loss: 0.4812 Epoch 78/150 110/110 - 1s - 6ms/step - accuracy: 0.7880 - loss: 0.9109 - val_accuracy: 0.7889 - val_loss: 0.4457 Epoch 79/150 110/110 - 1s - 6ms/step - accuracy: 0.7917 - loss: 0.9098 - val_accuracy: 0.7700 - val_loss: 0.4725 Epoch 80/150 110/110 - 1s - 5ms/step - accuracy: 0.7907 - loss: 0.9096 - val_accuracy: 0.8006 - val_loss: 0.4224 Epoch 81/150 110/110 - 1s - 6ms/step - accuracy: 0.7939 - loss: 0.9103 - val_accuracy: 0.7611 - val_loss: 0.4853 Epoch 82/150 110/110 - 1s - 6ms/step - accuracy: 0.7944 - loss: 0.9076 - val_accuracy: 0.7800 - val_loss: 0.4577 Epoch 83/150 110/110 - 1s - 6ms/step - accuracy: 0.7951 - loss: 0.9057 - val_accuracy: 0.7356 - val_loss: 0.5180 Epoch 84/150 110/110 - 0s - 3ms/step - accuracy: 0.7934 - loss: 0.9066 - val_accuracy: 0.7756 - val_loss: 0.4627 Epoch 85/150 110/110 - 1s - 6ms/step - accuracy: 0.7933 - loss: 0.9067 - val_accuracy: 0.7761 - val_loss: 0.4608 Epoch 86/150 110/110 - 0s - 4ms/step - accuracy: 0.7933 - loss: 0.9032 - val_accuracy: 0.7661 - val_loss: 0.4795 Epoch 87/150 110/110 - 1s - 6ms/step - accuracy: 0.7953 - loss: 0.9037 - val_accuracy: 0.7772 - val_loss: 0.4620 Epoch 88/150 110/110 - 1s - 6ms/step - accuracy: 0.7930 - loss: 0.9018 - val_accuracy: 0.7817 - val_loss: 0.4535 Epoch 89/150 110/110 - 1s - 5ms/step - accuracy: 0.7967 - loss: 0.9005 - val_accuracy: 0.7400 - val_loss: 0.5120 Epoch 90/150 110/110 - 1s - 6ms/step - accuracy: 0.7957 - loss: 0.8995 - val_accuracy: 0.7628 - val_loss: 0.4828 Epoch 91/150 110/110 - 1s - 5ms/step - accuracy: 0.7954 - loss: 0.9014 - val_accuracy: 0.7722 - val_loss: 0.4686 Epoch 92/150 110/110 - 0s - 4ms/step - accuracy: 0.7963 - loss: 0.8995 - val_accuracy: 0.7883 - val_loss: 0.4376 Epoch 93/150 110/110 - 0s - 4ms/step - accuracy: 0.7976 - loss: 0.8986 - val_accuracy: 0.7828 - val_loss: 0.4487 Epoch 94/150 110/110 - 0s - 3ms/step - accuracy: 0.7993 - loss: 0.8973 - val_accuracy: 0.7628 - val_loss: 0.4817 Epoch 95/150 110/110 - 1s - 6ms/step - accuracy: 0.7980 - loss: 0.8972 - val_accuracy: 0.7750 - val_loss: 0.4635 Epoch 96/150 110/110 - 0s - 3ms/step - accuracy: 0.7974 - loss: 0.8947 - val_accuracy: 0.7344 - val_loss: 0.5143 Epoch 97/150 110/110 - 0s - 3ms/step - accuracy: 0.7976 - loss: 0.8970 - val_accuracy: 0.7839 - val_loss: 0.4461 Epoch 98/150 110/110 - 1s - 5ms/step - accuracy: 0.7947 - loss: 0.8905 - val_accuracy: 0.7750 - val_loss: 0.4641 Epoch 99/150 110/110 - 0s - 4ms/step - accuracy: 0.7959 - loss: 0.8955 - val_accuracy: 0.8022 - val_loss: 0.4224 Epoch 100/150 110/110 - 1s - 5ms/step - accuracy: 0.8004 - loss: 0.8935 - val_accuracy: 0.7761 - val_loss: 0.4602 Epoch 101/150 110/110 - 0s - 3ms/step - accuracy: 0.7961 - loss: 0.8915 - val_accuracy: 0.7911 - val_loss: 0.4401 Epoch 102/150 110/110 - 1s - 6ms/step - accuracy: 0.7987 - loss: 0.8928 - val_accuracy: 0.7811 - val_loss: 0.4439 Epoch 103/150 110/110 - 1s - 5ms/step - accuracy: 0.7990 - loss: 0.8887 - val_accuracy: 0.7972 - val_loss: 0.4267 Epoch 104/150 110/110 - 0s - 3ms/step - accuracy: 0.7947 - loss: 0.8891 - val_accuracy: 0.7744 - val_loss: 0.4570 Epoch 105/150 110/110 - 1s - 5ms/step - accuracy: 0.7986 - loss: 0.8888 - val_accuracy: 0.7700 - val_loss: 0.4703 Epoch 106/150 110/110 - 0s - 3ms/step - accuracy: 0.7980 - loss: 0.8881 - val_accuracy: 0.7756 - val_loss: 0.4560 Epoch 107/150 110/110 - 0s - 3ms/step - accuracy: 0.7990 - loss: 0.8871 - val_accuracy: 0.7678 - val_loss: 0.4714 Epoch 108/150 110/110 - 0s - 3ms/step - accuracy: 0.7993 - loss: 0.8873 - val_accuracy: 0.7567 - val_loss: 0.4931 Epoch 109/150 110/110 - 1s - 6ms/step - accuracy: 0.7970 - loss: 0.8866 - val_accuracy: 0.8022 - val_loss: 0.4264 Epoch 110/150 110/110 - 0s - 3ms/step - accuracy: 0.8043 - loss: 0.8850 - val_accuracy: 0.7922 - val_loss: 0.4366 Epoch 111/150 110/110 - 1s - 6ms/step - accuracy: 0.8029 - loss: 0.8840 - val_accuracy: 0.7700 - val_loss: 0.4720 Epoch 112/150 110/110 - 1s - 6ms/step - accuracy: 0.7990 - loss: 0.8835 - val_accuracy: 0.7944 - val_loss: 0.4344 Epoch 113/150 110/110 - 1s - 5ms/step - accuracy: 0.8051 - loss: 0.8832 - val_accuracy: 0.7906 - val_loss: 0.4395 Epoch 114/150 110/110 - 1s - 5ms/step - accuracy: 0.8041 - loss: 0.8803 - val_accuracy: 0.7678 - val_loss: 0.4725 Epoch 115/150 110/110 - 1s - 6ms/step - accuracy: 0.8024 - loss: 0.8810 - val_accuracy: 0.8156 - val_loss: 0.4059 Epoch 116/150 110/110 - 1s - 5ms/step - accuracy: 0.8050 - loss: 0.8799 - val_accuracy: 0.7767 - val_loss: 0.4557 Epoch 117/150 110/110 - 1s - 6ms/step - accuracy: 0.8016 - loss: 0.8803 - val_accuracy: 0.7617 - val_loss: 0.4799 Epoch 118/150 110/110 - 0s - 4ms/step - accuracy: 0.7993 - loss: 0.8803 - val_accuracy: 0.7850 - val_loss: 0.4455 Epoch 119/150 110/110 - 0s - 4ms/step - accuracy: 0.8009 - loss: 0.8790 - val_accuracy: 0.8083 - val_loss: 0.4200 Epoch 120/150 110/110 - 1s - 5ms/step - accuracy: 0.8029 - loss: 0.8779 - val_accuracy: 0.8267 - val_loss: 0.3927 Epoch 121/150 110/110 - 1s - 6ms/step - accuracy: 0.8086 - loss: 0.8768 - val_accuracy: 0.7806 - val_loss: 0.4585 Epoch 122/150 110/110 - 1s - 6ms/step - accuracy: 0.8007 - loss: 0.8766 - val_accuracy: 0.7578 - val_loss: 0.4838 Epoch 123/150 110/110 - 1s - 6ms/step - accuracy: 0.8033 - loss: 0.8755 - val_accuracy: 0.7644 - val_loss: 0.4765 Epoch 124/150 110/110 - 1s - 6ms/step - accuracy: 0.8046 - loss: 0.8761 - val_accuracy: 0.8050 - val_loss: 0.4236 Epoch 125/150 110/110 - 0s - 3ms/step - accuracy: 0.8040 - loss: 0.8768 - val_accuracy: 0.7761 - val_loss: 0.4609 Epoch 126/150 110/110 - 0s - 3ms/step - accuracy: 0.8063 - loss: 0.8747 - val_accuracy: 0.7656 - val_loss: 0.4764 Epoch 127/150 110/110 - 0s - 4ms/step - accuracy: 0.8027 - loss: 0.8720 - val_accuracy: 0.7689 - val_loss: 0.4750 Epoch 128/150 110/110 - 0s - 4ms/step - accuracy: 0.8011 - loss: 0.8735 - val_accuracy: 0.8011 - val_loss: 0.4342 Epoch 129/150 110/110 - 0s - 4ms/step - accuracy: 0.8063 - loss: 0.8741 - val_accuracy: 0.7661 - val_loss: 0.4759 Epoch 130/150 110/110 - 1s - 5ms/step - accuracy: 0.8034 - loss: 0.8729 - val_accuracy: 0.8006 - val_loss: 0.4237 Epoch 131/150 110/110 - 1s - 6ms/step - accuracy: 0.8040 - loss: 0.8727 - val_accuracy: 0.7878 - val_loss: 0.4417 Epoch 132/150 110/110 - 0s - 4ms/step - accuracy: 0.8041 - loss: 0.8710 - val_accuracy: 0.7950 - val_loss: 0.4347 Epoch 133/150 110/110 - 1s - 6ms/step - accuracy: 0.8076 - loss: 0.8692 - val_accuracy: 0.7883 - val_loss: 0.4431 Epoch 134/150 110/110 - 1s - 5ms/step - accuracy: 0.8081 - loss: 0.8702 - val_accuracy: 0.7517 - val_loss: 0.4915 Epoch 135/150 110/110 - 1s - 6ms/step - accuracy: 0.8061 - loss: 0.8680 - val_accuracy: 0.7828 - val_loss: 0.4508 Epoch 136/150 110/110 - 0s - 4ms/step - accuracy: 0.8079 - loss: 0.8671 - val_accuracy: 0.8011 - val_loss: 0.4279 Epoch 137/150 110/110 - 1s - 7ms/step - accuracy: 0.8079 - loss: 0.8688 - val_accuracy: 0.7994 - val_loss: 0.4303 Epoch 138/150 110/110 - 1s - 5ms/step - accuracy: 0.8089 - loss: 0.8659 - val_accuracy: 0.7933 - val_loss: 0.4379 Epoch 139/150 110/110 - 1s - 5ms/step - accuracy: 0.8096 - loss: 0.8652 - val_accuracy: 0.7683 - val_loss: 0.4702 Epoch 140/150 110/110 - 1s - 5ms/step - accuracy: 0.8069 - loss: 0.8660 - val_accuracy: 0.7944 - val_loss: 0.4388 Epoch 141/150 110/110 - 1s - 5ms/step - accuracy: 0.8099 - loss: 0.8655 - val_accuracy: 0.7578 - val_loss: 0.4936 Epoch 142/150 110/110 - 1s - 13ms/step - accuracy: 0.8071 - loss: 0.8641 - val_accuracy: 0.7600 - val_loss: 0.4803 Epoch 143/150 110/110 - 0s - 3ms/step - accuracy: 0.8051 - loss: 0.8631 - val_accuracy: 0.8000 - val_loss: 0.4341 Epoch 144/150 110/110 - 0s - 3ms/step - accuracy: 0.8057 - loss: 0.8643 - val_accuracy: 0.7750 - val_loss: 0.4625 Epoch 145/150 110/110 - 0s - 3ms/step - accuracy: 0.8090 - loss: 0.8622 - val_accuracy: 0.7967 - val_loss: 0.4403 Epoch 146/150 110/110 - 1s - 5ms/step - accuracy: 0.8114 - loss: 0.8614 - val_accuracy: 0.8022 - val_loss: 0.4310 Epoch 147/150 110/110 - 1s - 6ms/step - accuracy: 0.8110 - loss: 0.8599 - val_accuracy: 0.7528 - val_loss: 0.4995 Epoch 148/150 110/110 - 0s - 3ms/step - accuracy: 0.8100 - loss: 0.8598 - val_accuracy: 0.7922 - val_loss: 0.4510 Epoch 149/150 110/110 - 1s - 6ms/step - accuracy: 0.8063 - loss: 0.8616 - val_accuracy: 0.7994 - val_loss: 0.4349 Epoch 150/150 110/110 - 1s - 5ms/step - accuracy: 0.8091 - loss: 0.8583 - val_accuracy: 0.7956 - val_loss: 0.4443
#plotting accuracy curve
plot(history4,'accuracy')
#observing loss curve
plot(history4,'loss')
# result observations
results.loc[4] = [2,[128,32],['tanh','tanh'],150,64,"sgd",['-', "-"],"xavier","-",history4.history["loss"][-1],history4.history["val_loss"][-1],history4.history["accuracy"][-1],history4.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
#observing training dataset performance
model_performance_classification(model4, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.814571 | 0.814571 | 0.854701 | 0.826628 |
#Observing validation performance
model_performance_classification(model4, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.795556 | 0.795556 | 0.839876 | 0.808817 |
Observations
- Model accuracy and recall decreased and oscillations increased.
Model 5
We'll start out with a baseline model having the following configuration:
- 1 input, 2 hidden(64,32), 1 output layers
- tanh and tanh activations for the first and second hidden layer respectively
- Stochastic Gradient Descent (SGD)
# clears the current Keras session, resetting all layers and models previously created, freeing up memory and resources.
tf.keras.backend.clear_session()
#define model 5
model5 = Sequential()
model5.add(Dense(64,activation="tanh",input_dim=X_train.shape[1]))
model5.add(Dense(32,activation="tanh"))
model5.add(Dense(1,activation="sigmoid"))
#model summary
model5.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#complile model
optimizer = tf.keras.optimizers.SGD() # defining SGD as the optimizer to be used
model5.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
# start training the model
start = time.time()
history5 = model5.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 64)
Epoch 1/150 110/110 - 1s - 12ms/step - accuracy: 0.6674 - loss: 1.3306 - val_accuracy: 0.6978 - val_loss: 0.6239 Epoch 2/150 110/110 - 1s - 9ms/step - accuracy: 0.7209 - loss: 1.2017 - val_accuracy: 0.7294 - val_loss: 0.5751 Epoch 3/150 110/110 - 1s - 5ms/step - accuracy: 0.7223 - loss: 1.1704 - val_accuracy: 0.7417 - val_loss: 0.5569 Epoch 4/150 110/110 - 1s - 6ms/step - accuracy: 0.7223 - loss: 1.1574 - val_accuracy: 0.7339 - val_loss: 0.5556 Epoch 5/150 110/110 - 0s - 4ms/step - accuracy: 0.7219 - loss: 1.1493 - val_accuracy: 0.7222 - val_loss: 0.5715 Epoch 6/150 110/110 - 0s - 4ms/step - accuracy: 0.7171 - loss: 1.1446 - val_accuracy: 0.7256 - val_loss: 0.5617 Epoch 7/150 110/110 - 0s - 3ms/step - accuracy: 0.7207 - loss: 1.1393 - val_accuracy: 0.7311 - val_loss: 0.5565 Epoch 8/150 110/110 - 1s - 6ms/step - accuracy: 0.7149 - loss: 1.1357 - val_accuracy: 0.7244 - val_loss: 0.5600 Epoch 9/150 110/110 - 1s - 5ms/step - accuracy: 0.7149 - loss: 1.1317 - val_accuracy: 0.7217 - val_loss: 0.5583 Epoch 10/150 110/110 - 1s - 6ms/step - accuracy: 0.7117 - loss: 1.1265 - val_accuracy: 0.7494 - val_loss: 0.5247 Epoch 11/150 110/110 - 1s - 6ms/step - accuracy: 0.7194 - loss: 1.1239 - val_accuracy: 0.7294 - val_loss: 0.5469 Epoch 12/150 110/110 - 0s - 3ms/step - accuracy: 0.7169 - loss: 1.1202 - val_accuracy: 0.7289 - val_loss: 0.5443 Epoch 13/150 110/110 - 1s - 5ms/step - accuracy: 0.7183 - loss: 1.1159 - val_accuracy: 0.7367 - val_loss: 0.5360 Epoch 14/150 110/110 - 0s - 3ms/step - accuracy: 0.7197 - loss: 1.1118 - val_accuracy: 0.7094 - val_loss: 0.5644 Epoch 15/150 110/110 - 1s - 6ms/step - accuracy: 0.7150 - loss: 1.1071 - val_accuracy: 0.7356 - val_loss: 0.5294 Epoch 16/150 110/110 - 0s - 3ms/step - accuracy: 0.7171 - loss: 1.1028 - val_accuracy: 0.7472 - val_loss: 0.5185 Epoch 17/150 110/110 - 0s - 3ms/step - accuracy: 0.7211 - loss: 1.0986 - val_accuracy: 0.7250 - val_loss: 0.5428 Epoch 18/150 110/110 - 1s - 6ms/step - accuracy: 0.7200 - loss: 1.0946 - val_accuracy: 0.7272 - val_loss: 0.5386 Epoch 19/150 110/110 - 1s - 6ms/step - accuracy: 0.7246 - loss: 1.0896 - val_accuracy: 0.7261 - val_loss: 0.5370 Epoch 20/150 110/110 - 1s - 6ms/step - accuracy: 0.7216 - loss: 1.0858 - val_accuracy: 0.7439 - val_loss: 0.5182 Epoch 21/150 110/110 - 0s - 3ms/step - accuracy: 0.7253 - loss: 1.0811 - val_accuracy: 0.7528 - val_loss: 0.5084 Epoch 22/150 110/110 - 0s - 4ms/step - accuracy: 0.7307 - loss: 1.0762 - val_accuracy: 0.7506 - val_loss: 0.5057 Epoch 23/150 110/110 - 1s - 5ms/step - accuracy: 0.7347 - loss: 1.0722 - val_accuracy: 0.7528 - val_loss: 0.5079 Epoch 24/150 110/110 - 1s - 6ms/step - accuracy: 0.7376 - loss: 1.0661 - val_accuracy: 0.6978 - val_loss: 0.5661 Epoch 25/150 110/110 - 0s - 4ms/step - accuracy: 0.7379 - loss: 1.0626 - val_accuracy: 0.7389 - val_loss: 0.5247 Epoch 26/150 110/110 - 1s - 6ms/step - accuracy: 0.7416 - loss: 1.0565 - val_accuracy: 0.7556 - val_loss: 0.5021 Epoch 27/150 110/110 - 1s - 5ms/step - accuracy: 0.7456 - loss: 1.0509 - val_accuracy: 0.7333 - val_loss: 0.5319 Epoch 28/150 110/110 - 1s - 5ms/step - accuracy: 0.7536 - loss: 1.0473 - val_accuracy: 0.6978 - val_loss: 0.5648 Epoch 29/150 110/110 - 1s - 5ms/step - accuracy: 0.7450 - loss: 1.0425 - val_accuracy: 0.7783 - val_loss: 0.4816 Epoch 30/150 110/110 - 1s - 6ms/step - accuracy: 0.7521 - loss: 1.0371 - val_accuracy: 0.7778 - val_loss: 0.4797 Epoch 31/150 110/110 - 1s - 5ms/step - accuracy: 0.7567 - loss: 1.0330 - val_accuracy: 0.7767 - val_loss: 0.4770 Epoch 32/150 110/110 - 0s - 4ms/step - accuracy: 0.7614 - loss: 1.0287 - val_accuracy: 0.7628 - val_loss: 0.4942 Epoch 33/150 110/110 - 1s - 5ms/step - accuracy: 0.7657 - loss: 1.0228 - val_accuracy: 0.7178 - val_loss: 0.5412 Epoch 34/150 110/110 - 0s - 3ms/step - accuracy: 0.7593 - loss: 1.0184 - val_accuracy: 0.7894 - val_loss: 0.4645 Epoch 35/150 110/110 - 1s - 5ms/step - accuracy: 0.7670 - loss: 1.0145 - val_accuracy: 0.7594 - val_loss: 0.4974 Epoch 36/150 110/110 - 0s - 3ms/step - accuracy: 0.7654 - loss: 1.0089 - val_accuracy: 0.7806 - val_loss: 0.4733 Epoch 37/150 110/110 - 1s - 6ms/step - accuracy: 0.7694 - loss: 1.0030 - val_accuracy: 0.7433 - val_loss: 0.5225 Epoch 38/150 110/110 - 1s - 6ms/step - accuracy: 0.7694 - loss: 0.9998 - val_accuracy: 0.7622 - val_loss: 0.4948 Epoch 39/150 110/110 - 0s - 4ms/step - accuracy: 0.7720 - loss: 0.9944 - val_accuracy: 0.7767 - val_loss: 0.4783 Epoch 40/150 110/110 - 1s - 5ms/step - accuracy: 0.7730 - loss: 0.9920 - val_accuracy: 0.7767 - val_loss: 0.4829 Epoch 41/150 110/110 - 0s - 4ms/step - accuracy: 0.7729 - loss: 0.9877 - val_accuracy: 0.7800 - val_loss: 0.4629 Epoch 42/150 110/110 - 0s - 4ms/step - accuracy: 0.7777 - loss: 0.9833 - val_accuracy: 0.7200 - val_loss: 0.5534 Epoch 43/150 110/110 - 1s - 5ms/step - accuracy: 0.7746 - loss: 0.9798 - val_accuracy: 0.7228 - val_loss: 0.5461 Epoch 44/150 110/110 - 0s - 4ms/step - accuracy: 0.7737 - loss: 0.9757 - val_accuracy: 0.7661 - val_loss: 0.4854 Epoch 45/150 110/110 - 1s - 5ms/step - accuracy: 0.7746 - loss: 0.9727 - val_accuracy: 0.7844 - val_loss: 0.4571 Epoch 46/150 110/110 - 0s - 3ms/step - accuracy: 0.7789 - loss: 0.9683 - val_accuracy: 0.7794 - val_loss: 0.4742 Epoch 47/150 110/110 - 1s - 6ms/step - accuracy: 0.7774 - loss: 0.9672 - val_accuracy: 0.7761 - val_loss: 0.4719 Epoch 48/150 110/110 - 0s - 3ms/step - accuracy: 0.7789 - loss: 0.9623 - val_accuracy: 0.7822 - val_loss: 0.4622 Epoch 49/150 110/110 - 0s - 3ms/step - accuracy: 0.7789 - loss: 0.9590 - val_accuracy: 0.7833 - val_loss: 0.4659 Epoch 50/150 110/110 - 0s - 3ms/step - accuracy: 0.7807 - loss: 0.9576 - val_accuracy: 0.7550 - val_loss: 0.5088 Epoch 51/150 110/110 - 0s - 4ms/step - accuracy: 0.7779 - loss: 0.9557 - val_accuracy: 0.7944 - val_loss: 0.4415 Epoch 52/150 110/110 - 1s - 5ms/step - accuracy: 0.7819 - loss: 0.9509 - val_accuracy: 0.7394 - val_loss: 0.5212 Epoch 53/150 110/110 - 1s - 5ms/step - accuracy: 0.7779 - loss: 0.9512 - val_accuracy: 0.7828 - val_loss: 0.4578 Epoch 54/150 110/110 - 1s - 6ms/step - accuracy: 0.7821 - loss: 0.9486 - val_accuracy: 0.7972 - val_loss: 0.4343 Epoch 55/150 110/110 - 1s - 6ms/step - accuracy: 0.7813 - loss: 0.9451 - val_accuracy: 0.7728 - val_loss: 0.4842 Epoch 56/150 110/110 - 1s - 5ms/step - accuracy: 0.7831 - loss: 0.9437 - val_accuracy: 0.7994 - val_loss: 0.4306 Epoch 57/150 110/110 - 1s - 6ms/step - accuracy: 0.7830 - loss: 0.9426 - val_accuracy: 0.7772 - val_loss: 0.4703 Epoch 58/150 110/110 - 0s - 4ms/step - accuracy: 0.7834 - loss: 0.9390 - val_accuracy: 0.7800 - val_loss: 0.4621 Epoch 59/150 110/110 - 0s - 4ms/step - accuracy: 0.7856 - loss: 0.9373 - val_accuracy: 0.7839 - val_loss: 0.4541 Epoch 60/150 110/110 - 1s - 5ms/step - accuracy: 0.7814 - loss: 0.9341 - val_accuracy: 0.7961 - val_loss: 0.4325 Epoch 61/150 110/110 - 0s - 3ms/step - accuracy: 0.7833 - loss: 0.9355 - val_accuracy: 0.7539 - val_loss: 0.5098 Epoch 62/150 110/110 - 1s - 5ms/step - accuracy: 0.7829 - loss: 0.9334 - val_accuracy: 0.7761 - val_loss: 0.4678 Epoch 63/150 110/110 - 0s - 4ms/step - accuracy: 0.7867 - loss: 0.9292 - val_accuracy: 0.7772 - val_loss: 0.4661 Epoch 64/150 110/110 - 0s - 3ms/step - accuracy: 0.7860 - loss: 0.9289 - val_accuracy: 0.7178 - val_loss: 0.5499 Epoch 65/150 110/110 - 1s - 6ms/step - accuracy: 0.7850 - loss: 0.9287 - val_accuracy: 0.7939 - val_loss: 0.4355 Epoch 66/150 110/110 - 1s - 5ms/step - accuracy: 0.7869 - loss: 0.9254 - val_accuracy: 0.7678 - val_loss: 0.4765 Epoch 67/150 110/110 - 1s - 6ms/step - accuracy: 0.7839 - loss: 0.9230 - val_accuracy: 0.7711 - val_loss: 0.4787 Epoch 68/150 110/110 - 0s - 3ms/step - accuracy: 0.7904 - loss: 0.9220 - val_accuracy: 0.7706 - val_loss: 0.4703 Epoch 69/150 110/110 - 1s - 5ms/step - accuracy: 0.7847 - loss: 0.9204 - val_accuracy: 0.7522 - val_loss: 0.5057 Epoch 70/150 110/110 - 1s - 6ms/step - accuracy: 0.7863 - loss: 0.9219 - val_accuracy: 0.7361 - val_loss: 0.5204 Epoch 71/150 110/110 - 0s - 4ms/step - accuracy: 0.7886 - loss: 0.9221 - val_accuracy: 0.7444 - val_loss: 0.5085 Epoch 72/150 110/110 - 1s - 5ms/step - accuracy: 0.7850 - loss: 0.9185 - val_accuracy: 0.7750 - val_loss: 0.4639 Epoch 73/150 110/110 - 0s - 4ms/step - accuracy: 0.7887 - loss: 0.9193 - val_accuracy: 0.7706 - val_loss: 0.4728 Epoch 74/150 110/110 - 1s - 5ms/step - accuracy: 0.7884 - loss: 0.9153 - val_accuracy: 0.7978 - val_loss: 0.4298 Epoch 75/150 110/110 - 1s - 6ms/step - accuracy: 0.7899 - loss: 0.9146 - val_accuracy: 0.7922 - val_loss: 0.4332 Epoch 76/150 110/110 - 0s - 3ms/step - accuracy: 0.7911 - loss: 0.9167 - val_accuracy: 0.7344 - val_loss: 0.5193 Epoch 77/150 110/110 - 1s - 6ms/step - accuracy: 0.7863 - loss: 0.9135 - val_accuracy: 0.7933 - val_loss: 0.4335 Epoch 78/150 110/110 - 0s - 3ms/step - accuracy: 0.7931 - loss: 0.9111 - val_accuracy: 0.7472 - val_loss: 0.5009 Epoch 79/150 110/110 - 1s - 5ms/step - accuracy: 0.7884 - loss: 0.9085 - val_accuracy: 0.7900 - val_loss: 0.4435 Epoch 80/150 110/110 - 1s - 5ms/step - accuracy: 0.7903 - loss: 0.9135 - val_accuracy: 0.7539 - val_loss: 0.4960 Epoch 81/150 110/110 - 1s - 6ms/step - accuracy: 0.7913 - loss: 0.9118 - val_accuracy: 0.7822 - val_loss: 0.4526 Epoch 82/150 110/110 - 1s - 6ms/step - accuracy: 0.7923 - loss: 0.9085 - val_accuracy: 0.8094 - val_loss: 0.4107 Epoch 83/150 110/110 - 1s - 5ms/step - accuracy: 0.7954 - loss: 0.9084 - val_accuracy: 0.7733 - val_loss: 0.4682 Epoch 84/150 110/110 - 0s - 5ms/step - accuracy: 0.7931 - loss: 0.9055 - val_accuracy: 0.7783 - val_loss: 0.4532 Epoch 85/150 110/110 - 1s - 5ms/step - accuracy: 0.7893 - loss: 0.9063 - val_accuracy: 0.7994 - val_loss: 0.4277 Epoch 86/150 110/110 - 1s - 5ms/step - accuracy: 0.7936 - loss: 0.9060 - val_accuracy: 0.7544 - val_loss: 0.4949 Epoch 87/150 110/110 - 0s - 3ms/step - accuracy: 0.7909 - loss: 0.9023 - val_accuracy: 0.7494 - val_loss: 0.5051 Epoch 88/150 110/110 - 1s - 6ms/step - accuracy: 0.7897 - loss: 0.9043 - val_accuracy: 0.7922 - val_loss: 0.4319 Epoch 89/150 110/110 - 0s - 3ms/step - accuracy: 0.7906 - loss: 0.9041 - val_accuracy: 0.8183 - val_loss: 0.3987 Epoch 90/150 110/110 - 1s - 6ms/step - accuracy: 0.7927 - loss: 0.9031 - val_accuracy: 0.8039 - val_loss: 0.4202 Epoch 91/150 110/110 - 0s - 3ms/step - accuracy: 0.7954 - loss: 0.9028 - val_accuracy: 0.7861 - val_loss: 0.4440 Epoch 92/150 110/110 - 1s - 6ms/step - accuracy: 0.7964 - loss: 0.9001 - val_accuracy: 0.8239 - val_loss: 0.3978 Epoch 93/150 110/110 - 1s - 6ms/step - accuracy: 0.7949 - loss: 0.9015 - val_accuracy: 0.7628 - val_loss: 0.4782 Epoch 94/150 110/110 - 0s - 3ms/step - accuracy: 0.7933 - loss: 0.9003 - val_accuracy: 0.7983 - val_loss: 0.4343 Epoch 95/150 110/110 - 1s - 5ms/step - accuracy: 0.7960 - loss: 0.8986 - val_accuracy: 0.7533 - val_loss: 0.4883 Epoch 96/150 110/110 - 0s - 3ms/step - accuracy: 0.7959 - loss: 0.8994 - val_accuracy: 0.7706 - val_loss: 0.4672 Epoch 97/150 110/110 - 1s - 5ms/step - accuracy: 0.7961 - loss: 0.8974 - val_accuracy: 0.7206 - val_loss: 0.5384 Epoch 98/150 110/110 - 1s - 6ms/step - accuracy: 0.7971 - loss: 0.8966 - val_accuracy: 0.7417 - val_loss: 0.5175 Epoch 99/150 110/110 - 1s - 6ms/step - accuracy: 0.7963 - loss: 0.8974 - val_accuracy: 0.7744 - val_loss: 0.4600 Epoch 100/150 110/110 - 0s - 3ms/step - accuracy: 0.7979 - loss: 0.8969 - val_accuracy: 0.6800 - val_loss: 0.6032 Epoch 101/150 110/110 - 1s - 5ms/step - accuracy: 0.7941 - loss: 0.8960 - val_accuracy: 0.8106 - val_loss: 0.4117 Epoch 102/150 110/110 - 1s - 6ms/step - accuracy: 0.7997 - loss: 0.8946 - val_accuracy: 0.7872 - val_loss: 0.4426 Epoch 103/150 110/110 - 0s - 3ms/step - accuracy: 0.8003 - loss: 0.8932 - val_accuracy: 0.7789 - val_loss: 0.4474 Epoch 104/150 110/110 - 1s - 8ms/step - accuracy: 0.7997 - loss: 0.8918 - val_accuracy: 0.8067 - val_loss: 0.4160 Epoch 105/150 110/110 - 1s - 6ms/step - accuracy: 0.7986 - loss: 0.8908 - val_accuracy: 0.7244 - val_loss: 0.5386 Epoch 106/150 110/110 - 1s - 10ms/step - accuracy: 0.7966 - loss: 0.8910 - val_accuracy: 0.7983 - val_loss: 0.4277 Epoch 107/150 110/110 - 1s - 7ms/step - accuracy: 0.7977 - loss: 0.8931 - val_accuracy: 0.7928 - val_loss: 0.4325 Epoch 108/150 110/110 - 1s - 5ms/step - accuracy: 0.8017 - loss: 0.8877 - val_accuracy: 0.7467 - val_loss: 0.5030 Epoch 109/150 110/110 - 0s - 4ms/step - accuracy: 0.7983 - loss: 0.8882 - val_accuracy: 0.7933 - val_loss: 0.4347 Epoch 110/150 110/110 - 1s - 5ms/step - accuracy: 0.8006 - loss: 0.8873 - val_accuracy: 0.8189 - val_loss: 0.4077 Epoch 111/150 110/110 - 1s - 6ms/step - accuracy: 0.7989 - loss: 0.8880 - val_accuracy: 0.7494 - val_loss: 0.5037 Epoch 112/150 110/110 - 1s - 5ms/step - accuracy: 0.7999 - loss: 0.8858 - val_accuracy: 0.7744 - val_loss: 0.4591 Epoch 113/150 110/110 - 1s - 6ms/step - accuracy: 0.8010 - loss: 0.8891 - val_accuracy: 0.7544 - val_loss: 0.4930 Epoch 114/150 110/110 - 1s - 6ms/step - accuracy: 0.7996 - loss: 0.8869 - val_accuracy: 0.8122 - val_loss: 0.4117 Epoch 115/150 110/110 - 1s - 5ms/step - accuracy: 0.7990 - loss: 0.8887 - val_accuracy: 0.7417 - val_loss: 0.5030 Epoch 116/150 110/110 - 0s - 4ms/step - accuracy: 0.7999 - loss: 0.8848 - val_accuracy: 0.7994 - val_loss: 0.4218 Epoch 117/150 110/110 - 1s - 5ms/step - accuracy: 0.7986 - loss: 0.8849 - val_accuracy: 0.7978 - val_loss: 0.4205 Epoch 118/150 110/110 - 0s - 3ms/step - accuracy: 0.8017 - loss: 0.8850 - val_accuracy: 0.7878 - val_loss: 0.4433 Epoch 119/150 110/110 - 0s - 3ms/step - accuracy: 0.8004 - loss: 0.8858 - val_accuracy: 0.8100 - val_loss: 0.4121 Epoch 120/150 110/110 - 0s - 3ms/step - accuracy: 0.8026 - loss: 0.8898 - val_accuracy: 0.7550 - val_loss: 0.4825 Epoch 121/150 110/110 - 1s - 6ms/step - accuracy: 0.8029 - loss: 0.8829 - val_accuracy: 0.7483 - val_loss: 0.4945 Epoch 122/150 110/110 - 1s - 6ms/step - accuracy: 0.8021 - loss: 0.8820 - val_accuracy: 0.8461 - val_loss: 0.3687 Epoch 123/150 110/110 - 1s - 6ms/step - accuracy: 0.8043 - loss: 0.8795 - val_accuracy: 0.7606 - val_loss: 0.4822 Epoch 124/150 110/110 - 1s - 5ms/step - accuracy: 0.8030 - loss: 0.8833 - val_accuracy: 0.7667 - val_loss: 0.4670 Epoch 125/150 110/110 - 1s - 6ms/step - accuracy: 0.8017 - loss: 0.8834 - val_accuracy: 0.7928 - val_loss: 0.4276 Epoch 126/150 110/110 - 1s - 6ms/step - accuracy: 0.8026 - loss: 0.8802 - val_accuracy: 0.7900 - val_loss: 0.4261 Epoch 127/150 110/110 - 0s - 4ms/step - accuracy: 0.8036 - loss: 0.8798 - val_accuracy: 0.7728 - val_loss: 0.4638 Epoch 128/150 110/110 - 1s - 7ms/step - accuracy: 0.8037 - loss: 0.8788 - val_accuracy: 0.8150 - val_loss: 0.4040 Epoch 129/150 110/110 - 1s - 5ms/step - accuracy: 0.8029 - loss: 0.8801 - val_accuracy: 0.7756 - val_loss: 0.4595 Epoch 130/150 110/110 - 1s - 6ms/step - accuracy: 0.8054 - loss: 0.8775 - val_accuracy: 0.7383 - val_loss: 0.5183 Epoch 131/150 110/110 - 1s - 6ms/step - accuracy: 0.8024 - loss: 0.8771 - val_accuracy: 0.8211 - val_loss: 0.3996 Epoch 132/150 110/110 - 1s - 6ms/step - accuracy: 0.8057 - loss: 0.8730 - val_accuracy: 0.7539 - val_loss: 0.4932 Epoch 133/150 110/110 - 0s - 4ms/step - accuracy: 0.8049 - loss: 0.8750 - val_accuracy: 0.7867 - val_loss: 0.4553 Epoch 134/150 110/110 - 0s - 3ms/step - accuracy: 0.8049 - loss: 0.8737 - val_accuracy: 0.7767 - val_loss: 0.4531 Epoch 135/150 110/110 - 1s - 5ms/step - accuracy: 0.8079 - loss: 0.8746 - val_accuracy: 0.7617 - val_loss: 0.4774 Epoch 136/150 110/110 - 0s - 3ms/step - accuracy: 0.8034 - loss: 0.8746 - val_accuracy: 0.7278 - val_loss: 0.5300 Epoch 137/150 110/110 - 1s - 6ms/step - accuracy: 0.8021 - loss: 0.8770 - val_accuracy: 0.7567 - val_loss: 0.4871 Epoch 138/150 110/110 - 1s - 6ms/step - accuracy: 0.8030 - loss: 0.8747 - val_accuracy: 0.7606 - val_loss: 0.4748 Epoch 139/150 110/110 - 1s - 6ms/step - accuracy: 0.8029 - loss: 0.8733 - val_accuracy: 0.7883 - val_loss: 0.4433 Epoch 140/150 110/110 - 1s - 6ms/step - accuracy: 0.8067 - loss: 0.8719 - val_accuracy: 0.7611 - val_loss: 0.4784 Epoch 141/150 110/110 - 0s - 3ms/step - accuracy: 0.8050 - loss: 0.8747 - val_accuracy: 0.8144 - val_loss: 0.4075 Epoch 142/150 110/110 - 0s - 3ms/step - accuracy: 0.8083 - loss: 0.8722 - val_accuracy: 0.7983 - val_loss: 0.4300 Epoch 143/150 110/110 - 1s - 5ms/step - accuracy: 0.8047 - loss: 0.8710 - val_accuracy: 0.7567 - val_loss: 0.4914 Epoch 144/150 110/110 - 1s - 6ms/step - accuracy: 0.8063 - loss: 0.8712 - val_accuracy: 0.7911 - val_loss: 0.4451 Epoch 145/150 110/110 - 0s - 3ms/step - accuracy: 0.8060 - loss: 0.8690 - val_accuracy: 0.8283 - val_loss: 0.3879 Epoch 146/150 110/110 - 1s - 6ms/step - accuracy: 0.8047 - loss: 0.8697 - val_accuracy: 0.7972 - val_loss: 0.4251 Epoch 147/150 110/110 - 1s - 6ms/step - accuracy: 0.8081 - loss: 0.8710 - val_accuracy: 0.7950 - val_loss: 0.4286 Epoch 148/150 110/110 - 0s - 3ms/step - accuracy: 0.8083 - loss: 0.8674 - val_accuracy: 0.8239 - val_loss: 0.3989 Epoch 149/150 110/110 - 1s - 5ms/step - accuracy: 0.8074 - loss: 0.8645 - val_accuracy: 0.7989 - val_loss: 0.4280 Epoch 150/150 110/110 - 0s - 3ms/step - accuracy: 0.8069 - loss: 0.8665 - val_accuracy: 0.8172 - val_loss: 0.4052
#Observing accuracy curve
plot(history5,'accuracy')
#Observing Loss curve
plot(history5,'loss')
#Observing all model paramerters
results.loc[5] = [2,[64,32],['tanh','tanh'],150,64,"sgd",['-', "-"],"xavier","-",history5.history["loss"][-1],history5.history["val_loss"][-1],history5.history["accuracy"][-1],history5.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
#Observing model performace for training data
model_performance_classification(model5, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.831143 | 0.831143 | 0.854051 | 0.839051 |
#Observing model performance for validation data
model_performance_classification(model5, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.817222 | 0.817222 | 0.841289 | 0.82558 |
Observations
- Improvement in Accuracy,recall and F1_score.
- But,This model is not performing that well as compared to model 3.
Model 6
- Let's use model 3 with changing learning rate and adding momemtum
- 1 input, 2 hidden, 1 output layers
- relu and tanh activations for the first and second hidden layer respectively
- Stochastic Gradient Descent (SGD)
- Learning rate 0.001 and momentum 0.3
- batch size=128
# clears the current Keras session, resetting all layers and models previously created, freeing up memory and resources.
tf.keras.backend.clear_session()
#define model
model6 = Sequential()
model6.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model6.add(Dense(32,activation="tanh"))
model6.add(Dense(1,activation="sigmoid"))
# model summary
model6.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#Compile model
optimizer = tf.keras.optimizers.SGD(learning_rate=0.001, momentum=0.3) # defining SGD as the optimizer to be used
model6.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
# start training the model
start = time.time()
history6 = model6.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 128)
Epoch 1/150 55/55 - 2s - 32ms/step - accuracy: 0.3093 - loss: 1.3995 - val_accuracy: 0.3778 - val_loss: 0.7793 Epoch 2/150 55/55 - 1s - 15ms/step - accuracy: 0.4064 - loss: 1.3517 - val_accuracy: 0.4583 - val_loss: 0.7262 Epoch 3/150 55/55 - 0s - 4ms/step - accuracy: 0.4903 - loss: 1.3253 - val_accuracy: 0.5272 - val_loss: 0.6946 Epoch 4/150 55/55 - 0s - 5ms/step - accuracy: 0.5429 - loss: 1.3075 - val_accuracy: 0.5733 - val_loss: 0.6733 Epoch 5/150 55/55 - 0s - 5ms/step - accuracy: 0.5811 - loss: 1.2939 - val_accuracy: 0.5956 - val_loss: 0.6604 Epoch 6/150 55/55 - 0s - 5ms/step - accuracy: 0.5983 - loss: 1.2825 - val_accuracy: 0.6094 - val_loss: 0.6499 Epoch 7/150 55/55 - 0s - 6ms/step - accuracy: 0.6227 - loss: 1.2726 - val_accuracy: 0.6294 - val_loss: 0.6418 Epoch 8/150 55/55 - 0s - 5ms/step - accuracy: 0.6349 - loss: 1.2637 - val_accuracy: 0.6400 - val_loss: 0.6354 Epoch 9/150 55/55 - 0s - 6ms/step - accuracy: 0.6470 - loss: 1.2559 - val_accuracy: 0.6472 - val_loss: 0.6308 Epoch 10/150 55/55 - 0s - 5ms/step - accuracy: 0.6541 - loss: 1.2487 - val_accuracy: 0.6550 - val_loss: 0.6266 Epoch 11/150 55/55 - 0s - 5ms/step - accuracy: 0.6597 - loss: 1.2422 - val_accuracy: 0.6594 - val_loss: 0.6225 Epoch 12/150 55/55 - 0s - 6ms/step - accuracy: 0.6719 - loss: 1.2363 - val_accuracy: 0.6678 - val_loss: 0.6204 Epoch 13/150 55/55 - 0s - 5ms/step - accuracy: 0.6717 - loss: 1.2309 - val_accuracy: 0.6711 - val_loss: 0.6169 Epoch 14/150 55/55 - 0s - 4ms/step - accuracy: 0.6760 - loss: 1.2258 - val_accuracy: 0.6761 - val_loss: 0.6141 Epoch 15/150 55/55 - 0s - 5ms/step - accuracy: 0.6830 - loss: 1.2212 - val_accuracy: 0.6828 - val_loss: 0.6125 Epoch 16/150 55/55 - 0s - 6ms/step - accuracy: 0.6821 - loss: 1.2169 - val_accuracy: 0.6844 - val_loss: 0.6096 Epoch 17/150 55/55 - 0s - 4ms/step - accuracy: 0.6864 - loss: 1.2129 - val_accuracy: 0.6894 - val_loss: 0.6073 Epoch 18/150 55/55 - 0s - 4ms/step - accuracy: 0.6883 - loss: 1.2092 - val_accuracy: 0.6917 - val_loss: 0.6048 Epoch 19/150 55/55 - 0s - 6ms/step - accuracy: 0.6929 - loss: 1.2056 - val_accuracy: 0.6939 - val_loss: 0.6036 Epoch 20/150 55/55 - 0s - 5ms/step - accuracy: 0.6939 - loss: 1.2024 - val_accuracy: 0.6967 - val_loss: 0.6018 Epoch 21/150 55/55 - 0s - 6ms/step - accuracy: 0.6924 - loss: 1.1992 - val_accuracy: 0.7006 - val_loss: 0.5989 Epoch 22/150 55/55 - 0s - 5ms/step - accuracy: 0.7001 - loss: 1.1963 - val_accuracy: 0.7022 - val_loss: 0.5978 Epoch 23/150 55/55 - 0s - 5ms/step - accuracy: 0.7001 - loss: 1.1935 - val_accuracy: 0.7033 - val_loss: 0.5962 Epoch 24/150 55/55 - 0s - 4ms/step - accuracy: 0.7000 - loss: 1.1908 - val_accuracy: 0.7067 - val_loss: 0.5939 Epoch 25/150 55/55 - 0s - 4ms/step - accuracy: 0.7049 - loss: 1.1883 - val_accuracy: 0.7078 - val_loss: 0.5934 Epoch 26/150 55/55 - 0s - 4ms/step - accuracy: 0.7009 - loss: 1.1860 - val_accuracy: 0.7106 - val_loss: 0.5911 Epoch 27/150 55/55 - 0s - 4ms/step - accuracy: 0.7074 - loss: 1.1837 - val_accuracy: 0.7111 - val_loss: 0.5906 Epoch 28/150 55/55 - 0s - 4ms/step - accuracy: 0.7070 - loss: 1.1815 - val_accuracy: 0.7111 - val_loss: 0.5895 Epoch 29/150 55/55 - 0s - 6ms/step - accuracy: 0.7064 - loss: 1.1795 - val_accuracy: 0.7128 - val_loss: 0.5875 Epoch 30/150 55/55 - 0s - 5ms/step - accuracy: 0.7097 - loss: 1.1774 - val_accuracy: 0.7161 - val_loss: 0.5857 Epoch 31/150 55/55 - 0s - 4ms/step - accuracy: 0.7141 - loss: 1.1755 - val_accuracy: 0.7167 - val_loss: 0.5860 Epoch 32/150 55/55 - 0s - 4ms/step - accuracy: 0.7089 - loss: 1.1737 - val_accuracy: 0.7167 - val_loss: 0.5844 Epoch 33/150 55/55 - 0s - 6ms/step - accuracy: 0.7141 - loss: 1.1719 - val_accuracy: 0.7156 - val_loss: 0.5839 Epoch 34/150 55/55 - 0s - 5ms/step - accuracy: 0.7113 - loss: 1.1702 - val_accuracy: 0.7167 - val_loss: 0.5826 Epoch 35/150 55/55 - 0s - 5ms/step - accuracy: 0.7130 - loss: 1.1686 - val_accuracy: 0.7178 - val_loss: 0.5810 Epoch 36/150 55/55 - 0s - 4ms/step - accuracy: 0.7134 - loss: 1.1670 - val_accuracy: 0.7183 - val_loss: 0.5799 Epoch 37/150 55/55 - 0s - 7ms/step - accuracy: 0.7147 - loss: 1.1655 - val_accuracy: 0.7172 - val_loss: 0.5796 Epoch 38/150 55/55 - 1s - 11ms/step - accuracy: 0.7160 - loss: 1.1640 - val_accuracy: 0.7178 - val_loss: 0.5793 Epoch 39/150 55/55 - 1s - 11ms/step - accuracy: 0.7159 - loss: 1.1625 - val_accuracy: 0.7189 - val_loss: 0.5777 Epoch 40/150 55/55 - 1s - 12ms/step - accuracy: 0.7174 - loss: 1.1611 - val_accuracy: 0.7172 - val_loss: 0.5779 Epoch 41/150 55/55 - 1s - 12ms/step - accuracy: 0.7151 - loss: 1.1597 - val_accuracy: 0.7189 - val_loss: 0.5763 Epoch 42/150 55/55 - 0s - 5ms/step - accuracy: 0.7163 - loss: 1.1584 - val_accuracy: 0.7222 - val_loss: 0.5744 Epoch 43/150 55/55 - 0s - 4ms/step - accuracy: 0.7196 - loss: 1.1571 - val_accuracy: 0.7233 - val_loss: 0.5737 Epoch 44/150 55/55 - 0s - 5ms/step - accuracy: 0.7197 - loss: 1.1559 - val_accuracy: 0.7228 - val_loss: 0.5730 Epoch 45/150 55/55 - 0s - 5ms/step - accuracy: 0.7193 - loss: 1.1547 - val_accuracy: 0.7222 - val_loss: 0.5732 Epoch 46/150 55/55 - 0s - 4ms/step - accuracy: 0.7196 - loss: 1.1535 - val_accuracy: 0.7228 - val_loss: 0.5726 Epoch 47/150 55/55 - 0s - 5ms/step - accuracy: 0.7170 - loss: 1.1523 - val_accuracy: 0.7228 - val_loss: 0.5706 Epoch 48/150 55/55 - 0s - 6ms/step - accuracy: 0.7177 - loss: 1.1511 - val_accuracy: 0.7222 - val_loss: 0.5698 Epoch 49/150 55/55 - 0s - 4ms/step - accuracy: 0.7176 - loss: 1.1501 - val_accuracy: 0.7233 - val_loss: 0.5691 Epoch 50/150 55/55 - 0s - 6ms/step - accuracy: 0.7179 - loss: 1.1490 - val_accuracy: 0.7244 - val_loss: 0.5685 Epoch 51/150 55/55 - 0s - 6ms/step - accuracy: 0.7170 - loss: 1.1480 - val_accuracy: 0.7244 - val_loss: 0.5680 Epoch 52/150 55/55 - 0s - 4ms/step - accuracy: 0.7166 - loss: 1.1469 - val_accuracy: 0.7239 - val_loss: 0.5671 Epoch 53/150 55/55 - 0s - 4ms/step - accuracy: 0.7166 - loss: 1.1458 - val_accuracy: 0.7239 - val_loss: 0.5659 Epoch 54/150 55/55 - 0s - 6ms/step - accuracy: 0.7176 - loss: 1.1448 - val_accuracy: 0.7239 - val_loss: 0.5652 Epoch 55/150 55/55 - 0s - 5ms/step - accuracy: 0.7187 - loss: 1.1439 - val_accuracy: 0.7244 - val_loss: 0.5653 Epoch 56/150 55/55 - 0s - 5ms/step - accuracy: 0.7174 - loss: 1.1429 - val_accuracy: 0.7239 - val_loss: 0.5647 Epoch 57/150 55/55 - 0s - 5ms/step - accuracy: 0.7177 - loss: 1.1419 - val_accuracy: 0.7228 - val_loss: 0.5644 Epoch 58/150 55/55 - 0s - 4ms/step - accuracy: 0.7199 - loss: 1.1410 - val_accuracy: 0.7222 - val_loss: 0.5650 Epoch 59/150 55/55 - 0s - 6ms/step - accuracy: 0.7167 - loss: 1.1402 - val_accuracy: 0.7217 - val_loss: 0.5636 Epoch 60/150 55/55 - 0s - 6ms/step - accuracy: 0.7173 - loss: 1.1393 - val_accuracy: 0.7228 - val_loss: 0.5624 Epoch 61/150 55/55 - 0s - 5ms/step - accuracy: 0.7173 - loss: 1.1384 - val_accuracy: 0.7228 - val_loss: 0.5611 Epoch 62/150 55/55 - 0s - 5ms/step - accuracy: 0.7179 - loss: 1.1375 - val_accuracy: 0.7228 - val_loss: 0.5607 Epoch 63/150 55/55 - 0s - 5ms/step - accuracy: 0.7203 - loss: 1.1367 - val_accuracy: 0.7211 - val_loss: 0.5613 Epoch 64/150 55/55 - 0s - 6ms/step - accuracy: 0.7176 - loss: 1.1360 - val_accuracy: 0.7217 - val_loss: 0.5605 Epoch 65/150 55/55 - 0s - 5ms/step - accuracy: 0.7177 - loss: 1.1351 - val_accuracy: 0.7222 - val_loss: 0.5592 Epoch 66/150 55/55 - 0s - 5ms/step - accuracy: 0.7193 - loss: 1.1344 - val_accuracy: 0.7222 - val_loss: 0.5588 Epoch 67/150 55/55 - 0s - 4ms/step - accuracy: 0.7186 - loss: 1.1336 - val_accuracy: 0.7239 - val_loss: 0.5581 Epoch 68/150 55/55 - 0s - 4ms/step - accuracy: 0.7189 - loss: 1.1328 - val_accuracy: 0.7228 - val_loss: 0.5581 Epoch 69/150 55/55 - 0s - 4ms/step - accuracy: 0.7201 - loss: 1.1321 - val_accuracy: 0.7222 - val_loss: 0.5584 Epoch 70/150 55/55 - 0s - 6ms/step - accuracy: 0.7194 - loss: 1.1314 - val_accuracy: 0.7211 - val_loss: 0.5580 Epoch 71/150 55/55 - 0s - 6ms/step - accuracy: 0.7197 - loss: 1.1307 - val_accuracy: 0.7211 - val_loss: 0.5576 Epoch 72/150 55/55 - 0s - 4ms/step - accuracy: 0.7164 - loss: 1.1300 - val_accuracy: 0.7228 - val_loss: 0.5561 Epoch 73/150 55/55 - 0s - 5ms/step - accuracy: 0.7177 - loss: 1.1293 - val_accuracy: 0.7217 - val_loss: 0.5552 Epoch 74/150 55/55 - 0s - 6ms/step - accuracy: 0.7190 - loss: 1.1286 - val_accuracy: 0.7222 - val_loss: 0.5556 Epoch 75/150 55/55 - 0s - 4ms/step - accuracy: 0.7187 - loss: 1.1280 - val_accuracy: 0.7222 - val_loss: 0.5556 Epoch 76/150 55/55 - 0s - 5ms/step - accuracy: 0.7177 - loss: 1.1273 - val_accuracy: 0.7233 - val_loss: 0.5546 Epoch 77/150 55/55 - 0s - 5ms/step - accuracy: 0.7183 - loss: 1.1267 - val_accuracy: 0.7233 - val_loss: 0.5542 Epoch 78/150 55/55 - 0s - 7ms/step - accuracy: 0.7183 - loss: 1.1260 - val_accuracy: 0.7233 - val_loss: 0.5533 Epoch 79/150 55/55 - 1s - 10ms/step - accuracy: 0.7201 - loss: 1.1254 - val_accuracy: 0.7217 - val_loss: 0.5541 Epoch 80/150 55/55 - 1s - 11ms/step - accuracy: 0.7186 - loss: 1.1248 - val_accuracy: 0.7217 - val_loss: 0.5534 Epoch 81/150 55/55 - 1s - 13ms/step - accuracy: 0.7173 - loss: 1.1241 - val_accuracy: 0.7228 - val_loss: 0.5521 Epoch 82/150 55/55 - 1s - 10ms/step - accuracy: 0.7203 - loss: 1.1236 - val_accuracy: 0.7228 - val_loss: 0.5524 Epoch 83/150 55/55 - 1s - 9ms/step - accuracy: 0.7184 - loss: 1.1229 - val_accuracy: 0.7233 - val_loss: 0.5515 Epoch 84/150 55/55 - 0s - 4ms/step - accuracy: 0.7201 - loss: 1.1223 - val_accuracy: 0.7233 - val_loss: 0.5518 Epoch 85/150 55/55 - 0s - 5ms/step - accuracy: 0.7193 - loss: 1.1217 - val_accuracy: 0.7228 - val_loss: 0.5516 Epoch 86/150 55/55 - 0s - 6ms/step - accuracy: 0.7187 - loss: 1.1212 - val_accuracy: 0.7239 - val_loss: 0.5514 Epoch 87/150 55/55 - 0s - 5ms/step - accuracy: 0.7190 - loss: 1.1205 - val_accuracy: 0.7239 - val_loss: 0.5509 Epoch 88/150 55/55 - 0s - 5ms/step - accuracy: 0.7207 - loss: 1.1200 - val_accuracy: 0.7244 - val_loss: 0.5512 Epoch 89/150 55/55 - 0s - 5ms/step - accuracy: 0.7187 - loss: 1.1195 - val_accuracy: 0.7239 - val_loss: 0.5508 Epoch 90/150 55/55 - 0s - 4ms/step - accuracy: 0.7180 - loss: 1.1189 - val_accuracy: 0.7244 - val_loss: 0.5499 Epoch 91/150 55/55 - 0s - 4ms/step - accuracy: 0.7209 - loss: 1.1183 - val_accuracy: 0.7233 - val_loss: 0.5511 Epoch 92/150 55/55 - 0s - 5ms/step - accuracy: 0.7204 - loss: 1.1177 - val_accuracy: 0.7222 - val_loss: 0.5513 Epoch 93/150 55/55 - 0s - 6ms/step - accuracy: 0.7193 - loss: 1.1173 - val_accuracy: 0.7211 - val_loss: 0.5510 Epoch 94/150 55/55 - 0s - 5ms/step - accuracy: 0.7181 - loss: 1.1167 - val_accuracy: 0.7228 - val_loss: 0.5501 Epoch 95/150 55/55 - 0s - 5ms/step - accuracy: 0.7190 - loss: 1.1161 - val_accuracy: 0.7233 - val_loss: 0.5496 Epoch 96/150 55/55 - 0s - 4ms/step - accuracy: 0.7183 - loss: 1.1157 - val_accuracy: 0.7233 - val_loss: 0.5483 Epoch 97/150 55/55 - 0s - 5ms/step - accuracy: 0.7206 - loss: 1.1151 - val_accuracy: 0.7233 - val_loss: 0.5486 Epoch 98/150 55/55 - 0s - 6ms/step - accuracy: 0.7200 - loss: 1.1146 - val_accuracy: 0.7233 - val_loss: 0.5484 Epoch 99/150 55/55 - 0s - 5ms/step - accuracy: 0.7196 - loss: 1.1141 - val_accuracy: 0.7233 - val_loss: 0.5482 Epoch 100/150 55/55 - 0s - 4ms/step - accuracy: 0.7201 - loss: 1.1135 - val_accuracy: 0.7233 - val_loss: 0.5488 Epoch 101/150 55/55 - 0s - 6ms/step - accuracy: 0.7177 - loss: 1.1130 - val_accuracy: 0.7228 - val_loss: 0.5476 Epoch 102/150 55/55 - 0s - 4ms/step - accuracy: 0.7180 - loss: 1.1125 - val_accuracy: 0.7222 - val_loss: 0.5467 Epoch 103/150 55/55 - 0s - 5ms/step - accuracy: 0.7219 - loss: 1.1120 - val_accuracy: 0.7228 - val_loss: 0.5481 Epoch 104/150 55/55 - 0s - 4ms/step - accuracy: 0.7179 - loss: 1.1116 - val_accuracy: 0.7228 - val_loss: 0.5471 Epoch 105/150 55/55 - 0s - 5ms/step - accuracy: 0.7201 - loss: 1.1111 - val_accuracy: 0.7228 - val_loss: 0.5470 Epoch 106/150 55/55 - 0s - 6ms/step - accuracy: 0.7189 - loss: 1.1106 - val_accuracy: 0.7239 - val_loss: 0.5461 Epoch 107/150 55/55 - 1s - 9ms/step - accuracy: 0.7197 - loss: 1.1101 - val_accuracy: 0.7233 - val_loss: 0.5457 Epoch 108/150 55/55 - 0s - 9ms/step - accuracy: 0.7220 - loss: 1.1096 - val_accuracy: 0.7222 - val_loss: 0.5470 Epoch 109/150 55/55 - 1s - 10ms/step - accuracy: 0.7181 - loss: 1.1092 - val_accuracy: 0.7228 - val_loss: 0.5466 Epoch 110/150 55/55 - 0s - 9ms/step - accuracy: 0.7193 - loss: 1.1087 - val_accuracy: 0.7228 - val_loss: 0.5457 Epoch 111/150 55/55 - 0s - 7ms/step - accuracy: 0.7191 - loss: 1.1082 - val_accuracy: 0.7233 - val_loss: 0.5449 Epoch 112/150 55/55 - 1s - 12ms/step - accuracy: 0.7206 - loss: 1.1078 - val_accuracy: 0.7228 - val_loss: 0.5448 Epoch 113/150 55/55 - 1s - 13ms/step - accuracy: 0.7199 - loss: 1.1073 - val_accuracy: 0.7239 - val_loss: 0.5442 Epoch 114/150 55/55 - 0s - 9ms/step - accuracy: 0.7201 - loss: 1.1068 - val_accuracy: 0.7228 - val_loss: 0.5450 Epoch 115/150 55/55 - 1s - 16ms/step - accuracy: 0.7209 - loss: 1.1064 - val_accuracy: 0.7211 - val_loss: 0.5460 Epoch 116/150 55/55 - 2s - 33ms/step - accuracy: 0.7194 - loss: 1.1059 - val_accuracy: 0.7211 - val_loss: 0.5457 Epoch 117/150 55/55 - 2s - 30ms/step - accuracy: 0.7184 - loss: 1.1055 - val_accuracy: 0.7217 - val_loss: 0.5448 Epoch 118/150 55/55 - 0s - 5ms/step - accuracy: 0.7176 - loss: 1.1050 - val_accuracy: 0.7228 - val_loss: 0.5431 Epoch 119/150 55/55 - 0s - 7ms/step - accuracy: 0.7206 - loss: 1.1045 - val_accuracy: 0.7217 - val_loss: 0.5440 Epoch 120/150 55/55 - 0s - 5ms/step - accuracy: 0.7181 - loss: 1.1041 - val_accuracy: 0.7222 - val_loss: 0.5435 Epoch 121/150 55/55 - 0s - 5ms/step - accuracy: 0.7190 - loss: 1.1037 - val_accuracy: 0.7233 - val_loss: 0.5432 Epoch 122/150 55/55 - 0s - 6ms/step - accuracy: 0.7184 - loss: 1.1032 - val_accuracy: 0.7233 - val_loss: 0.5429 Epoch 123/150 55/55 - 0s - 8ms/step - accuracy: 0.7201 - loss: 1.1028 - val_accuracy: 0.7233 - val_loss: 0.5436 Epoch 124/150 55/55 - 1s - 15ms/step - accuracy: 0.7197 - loss: 1.1023 - val_accuracy: 0.7228 - val_loss: 0.5440 Epoch 125/150 55/55 - 0s - 5ms/step - accuracy: 0.7171 - loss: 1.1019 - val_accuracy: 0.7233 - val_loss: 0.5417 Epoch 126/150 55/55 - 0s - 5ms/step - accuracy: 0.7214 - loss: 1.1014 - val_accuracy: 0.7233 - val_loss: 0.5429 Epoch 127/150 55/55 - 0s - 5ms/step - accuracy: 0.7189 - loss: 1.1010 - val_accuracy: 0.7233 - val_loss: 0.5423 Epoch 128/150 55/55 - 1s - 14ms/step - accuracy: 0.7197 - loss: 1.1006 - val_accuracy: 0.7233 - val_loss: 0.5427 Epoch 129/150 55/55 - 0s - 8ms/step - accuracy: 0.7190 - loss: 1.1002 - val_accuracy: 0.7233 - val_loss: 0.5427 Epoch 130/150 55/55 - 1s - 11ms/step - accuracy: 0.7190 - loss: 1.0997 - val_accuracy: 0.7233 - val_loss: 0.5413 Epoch 131/150 55/55 - 1s - 13ms/step - accuracy: 0.7199 - loss: 1.0993 - val_accuracy: 0.7228 - val_loss: 0.5414 Epoch 132/150 55/55 - 1s - 10ms/step - accuracy: 0.7194 - loss: 1.0989 - val_accuracy: 0.7228 - val_loss: 0.5414 Epoch 133/150 55/55 - 1s - 13ms/step - accuracy: 0.7203 - loss: 1.0984 - val_accuracy: 0.7228 - val_loss: 0.5423 Epoch 134/150 55/55 - 1s - 9ms/step - accuracy: 0.7190 - loss: 1.0980 - val_accuracy: 0.7228 - val_loss: 0.5418 Epoch 135/150 55/55 - 1s - 11ms/step - accuracy: 0.7177 - loss: 1.0976 - val_accuracy: 0.7233 - val_loss: 0.5397 Epoch 136/150 55/55 - 1s - 12ms/step - accuracy: 0.7196 - loss: 1.0972 - val_accuracy: 0.7233 - val_loss: 0.5400 Epoch 137/150 55/55 - 0s - 6ms/step - accuracy: 0.7200 - loss: 1.0967 - val_accuracy: 0.7228 - val_loss: 0.5402 Epoch 138/150 55/55 - 1s - 11ms/step - accuracy: 0.7206 - loss: 1.0964 - val_accuracy: 0.7228 - val_loss: 0.5404 Epoch 139/150 55/55 - 1s - 13ms/step - accuracy: 0.7199 - loss: 1.0959 - val_accuracy: 0.7211 - val_loss: 0.5410 Epoch 140/150 55/55 - 1s - 10ms/step - accuracy: 0.7203 - loss: 1.0955 - val_accuracy: 0.7211 - val_loss: 0.5408 Epoch 141/150 55/55 - 1s - 10ms/step - accuracy: 0.7199 - loss: 1.0951 - val_accuracy: 0.7206 - val_loss: 0.5410 Epoch 142/150 55/55 - 0s - 4ms/step - accuracy: 0.7204 - loss: 1.0946 - val_accuracy: 0.7200 - val_loss: 0.5415 Epoch 143/150 55/55 - 0s - 6ms/step - accuracy: 0.7174 - loss: 1.0942 - val_accuracy: 0.7222 - val_loss: 0.5390 Epoch 144/150 55/55 - 0s - 4ms/step - accuracy: 0.7219 - loss: 1.0938 - val_accuracy: 0.7206 - val_loss: 0.5409 Epoch 145/150 55/55 - 0s - 5ms/step - accuracy: 0.7181 - loss: 1.0934 - val_accuracy: 0.7211 - val_loss: 0.5397 Epoch 146/150 55/55 - 0s - 4ms/step - accuracy: 0.7204 - loss: 1.0929 - val_accuracy: 0.7211 - val_loss: 0.5399 Epoch 147/150 55/55 - 0s - 5ms/step - accuracy: 0.7187 - loss: 1.0925 - val_accuracy: 0.7211 - val_loss: 0.5397 Epoch 148/150 55/55 - 0s - 6ms/step - accuracy: 0.7196 - loss: 1.0921 - val_accuracy: 0.7206 - val_loss: 0.5398 Epoch 149/150 55/55 - 0s - 4ms/step - accuracy: 0.7164 - loss: 1.0917 - val_accuracy: 0.7222 - val_loss: 0.5372 Epoch 150/150 55/55 - 0s - 4ms/step - accuracy: 0.7216 - loss: 1.0913 - val_accuracy: 0.7200 - val_loss: 0.5384
#plot accuracy curve
plot(history6,'accuracy')
#observe loss curve
plot(history6,'loss')
#Observing result table
results.loc[6] = [2,[64,32],['relu','tanh'],150,128,"sgd",[0.001,0.3],"xavier","-",history6.history["loss"][-1],history6.history["val_loss"][-1],history6.history["accuracy"][-1],history6.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
#Observing model performance for training dataset
model_performance_classification(model6, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.719714 | 0.719714 | 0.809513 | 0.744784 |
#observing model performance for validation data
model_performance_classification(model6, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.72 | 0.72 | 0.801933 | 0.743146 |
Observations
- Perfromance decreased but oscillations reduced.
Model 7
Let's design model with change in momentum and chage in batch size and neuron numbers
- 1 input, 2 hidden(64,32), 1 output layers
- relu and tanh activations for the first and second hidden layer respectively
- Stochastic Gradient Descent (SGD)
- learning rate=0.001,momemtum =0.9
- batch size=32
#clear keras session
tf.keras.backend.clear_session()
#define model
model7 = Sequential()
model7.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model7.add(Dense(32,activation="tanh"))
model7.add(Dense(1,activation="sigmoid"))
#compile model
optimizer = tf.keras.optimizers.SGD(learning_rate=0.001, momentum=0.9) # defining SGD as the optimizer to be used
model7.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#start training
start = time.time()
history7 = model7.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 32)
Epoch 1/150 219/219 - 3s - 12ms/step - accuracy: 0.6927 - loss: 1.2256 - val_accuracy: 0.6833 - val_loss: 0.6025 Epoch 2/150 219/219 - 1s - 3ms/step - accuracy: 0.7089 - loss: 1.1585 - val_accuracy: 0.7356 - val_loss: 0.5521 Epoch 3/150 219/219 - 1s - 3ms/step - accuracy: 0.7101 - loss: 1.1427 - val_accuracy: 0.6400 - val_loss: 0.6363 Epoch 4/150 219/219 - 1s - 3ms/step - accuracy: 0.7067 - loss: 1.1328 - val_accuracy: 0.7289 - val_loss: 0.5393 Epoch 5/150 219/219 - 1s - 4ms/step - accuracy: 0.7126 - loss: 1.1220 - val_accuracy: 0.6894 - val_loss: 0.5813 Epoch 6/150 219/219 - 1s - 5ms/step - accuracy: 0.7099 - loss: 1.1127 - val_accuracy: 0.6956 - val_loss: 0.5674 Epoch 7/150 219/219 - 1s - 3ms/step - accuracy: 0.7124 - loss: 1.1053 - val_accuracy: 0.7678 - val_loss: 0.4833 Epoch 8/150 219/219 - 1s - 6ms/step - accuracy: 0.7171 - loss: 1.0947 - val_accuracy: 0.6983 - val_loss: 0.5556 Epoch 9/150 219/219 - 1s - 6ms/step - accuracy: 0.7187 - loss: 1.0860 - val_accuracy: 0.7006 - val_loss: 0.5474 Epoch 10/150 219/219 - 1s - 7ms/step - accuracy: 0.7233 - loss: 1.0761 - val_accuracy: 0.6939 - val_loss: 0.5531 Epoch 11/150 219/219 - 1s - 5ms/step - accuracy: 0.7274 - loss: 1.0658 - val_accuracy: 0.7317 - val_loss: 0.5131 Epoch 12/150 219/219 - 1s - 5ms/step - accuracy: 0.7381 - loss: 1.0515 - val_accuracy: 0.7261 - val_loss: 0.5187 Epoch 13/150 219/219 - 1s - 5ms/step - accuracy: 0.7389 - loss: 1.0441 - val_accuracy: 0.7350 - val_loss: 0.5061 Epoch 14/150 219/219 - 1s - 5ms/step - accuracy: 0.7507 - loss: 1.0308 - val_accuracy: 0.7517 - val_loss: 0.4844 Epoch 15/150 219/219 - 1s - 3ms/step - accuracy: 0.7511 - loss: 1.0219 - val_accuracy: 0.7239 - val_loss: 0.5227 Epoch 16/150 219/219 - 1s - 3ms/step - accuracy: 0.7580 - loss: 1.0101 - val_accuracy: 0.7672 - val_loss: 0.4760 Epoch 17/150 219/219 - 1s - 3ms/step - accuracy: 0.7573 - loss: 1.0031 - val_accuracy: 0.7233 - val_loss: 0.5347 Epoch 18/150 219/219 - 1s - 3ms/step - accuracy: 0.7630 - loss: 0.9885 - val_accuracy: 0.7461 - val_loss: 0.4939 Epoch 19/150 219/219 - 1s - 3ms/step - accuracy: 0.7686 - loss: 0.9798 - val_accuracy: 0.7767 - val_loss: 0.4557 Epoch 20/150 219/219 - 1s - 3ms/step - accuracy: 0.7687 - loss: 0.9731 - val_accuracy: 0.7556 - val_loss: 0.4813 Epoch 21/150 219/219 - 1s - 3ms/step - accuracy: 0.7684 - loss: 0.9669 - val_accuracy: 0.7522 - val_loss: 0.4930 Epoch 22/150 219/219 - 1s - 3ms/step - accuracy: 0.7750 - loss: 0.9565 - val_accuracy: 0.7600 - val_loss: 0.4772 Epoch 23/150 219/219 - 1s - 2ms/step - accuracy: 0.7750 - loss: 0.9559 - val_accuracy: 0.7289 - val_loss: 0.5124 Epoch 24/150 219/219 - 1s - 3ms/step - accuracy: 0.7759 - loss: 0.9462 - val_accuracy: 0.7811 - val_loss: 0.4530 Epoch 25/150 219/219 - 1s - 3ms/step - accuracy: 0.7794 - loss: 0.9411 - val_accuracy: 0.7583 - val_loss: 0.4879 Epoch 26/150 219/219 - 1s - 3ms/step - accuracy: 0.7810 - loss: 0.9349 - val_accuracy: 0.7728 - val_loss: 0.4592 Epoch 27/150 219/219 - 1s - 3ms/step - accuracy: 0.7861 - loss: 0.9330 - val_accuracy: 0.7550 - val_loss: 0.4927 Epoch 28/150 219/219 - 1s - 3ms/step - accuracy: 0.7867 - loss: 0.9302 - val_accuracy: 0.7428 - val_loss: 0.5142 Epoch 29/150 219/219 - 1s - 3ms/step - accuracy: 0.7870 - loss: 0.9264 - val_accuracy: 0.7811 - val_loss: 0.4470 Epoch 30/150 219/219 - 1s - 3ms/step - accuracy: 0.7909 - loss: 0.9234 - val_accuracy: 0.7100 - val_loss: 0.5678 Epoch 31/150 219/219 - 1s - 4ms/step - accuracy: 0.7870 - loss: 0.9179 - val_accuracy: 0.7761 - val_loss: 0.4546 Epoch 32/150 219/219 - 1s - 5ms/step - accuracy: 0.7877 - loss: 0.9145 - val_accuracy: 0.8194 - val_loss: 0.4084 Epoch 33/150 219/219 - 1s - 6ms/step - accuracy: 0.7923 - loss: 0.9133 - val_accuracy: 0.8172 - val_loss: 0.4109 Epoch 34/150 219/219 - 1s - 5ms/step - accuracy: 0.7937 - loss: 0.9086 - val_accuracy: 0.7989 - val_loss: 0.4264 Epoch 35/150 219/219 - 1s - 3ms/step - accuracy: 0.7943 - loss: 0.9087 - val_accuracy: 0.7383 - val_loss: 0.5179 Epoch 36/150 219/219 - 1s - 3ms/step - accuracy: 0.7969 - loss: 0.9041 - val_accuracy: 0.7817 - val_loss: 0.4559 Epoch 37/150 219/219 - 1s - 3ms/step - accuracy: 0.7960 - loss: 0.9000 - val_accuracy: 0.7983 - val_loss: 0.4320 Epoch 38/150 219/219 - 1s - 3ms/step - accuracy: 0.7946 - loss: 0.9003 - val_accuracy: 0.7806 - val_loss: 0.4545 Epoch 39/150 219/219 - 1s - 3ms/step - accuracy: 0.7993 - loss: 0.8944 - val_accuracy: 0.7428 - val_loss: 0.5069 Epoch 40/150 219/219 - 1s - 3ms/step - accuracy: 0.7961 - loss: 0.8996 - val_accuracy: 0.7722 - val_loss: 0.4723 Epoch 41/150 219/219 - 1s - 3ms/step - accuracy: 0.7984 - loss: 0.8909 - val_accuracy: 0.7478 - val_loss: 0.5123 Epoch 42/150 219/219 - 1s - 3ms/step - accuracy: 0.7977 - loss: 0.8909 - val_accuracy: 0.7706 - val_loss: 0.4541 Epoch 43/150 219/219 - 1s - 3ms/step - accuracy: 0.8017 - loss: 0.8903 - val_accuracy: 0.8194 - val_loss: 0.4067 Epoch 44/150 219/219 - 1s - 3ms/step - accuracy: 0.8034 - loss: 0.8905 - val_accuracy: 0.7728 - val_loss: 0.4648 Epoch 45/150 219/219 - 1s - 3ms/step - accuracy: 0.8009 - loss: 0.8881 - val_accuracy: 0.8244 - val_loss: 0.4043 Epoch 46/150 219/219 - 1s - 3ms/step - accuracy: 0.8006 - loss: 0.8849 - val_accuracy: 0.7394 - val_loss: 0.5215 Epoch 47/150 219/219 - 1s - 3ms/step - accuracy: 0.8024 - loss: 0.8815 - val_accuracy: 0.7961 - val_loss: 0.4279 Epoch 48/150 219/219 - 1s - 3ms/step - accuracy: 0.8046 - loss: 0.8816 - val_accuracy: 0.7883 - val_loss: 0.4398 Epoch 49/150 219/219 - 1s - 4ms/step - accuracy: 0.8009 - loss: 0.8775 - val_accuracy: 0.8189 - val_loss: 0.4151 Epoch 50/150 219/219 - 1s - 5ms/step - accuracy: 0.8013 - loss: 0.8790 - val_accuracy: 0.8122 - val_loss: 0.4183 Epoch 51/150 219/219 - 1s - 4ms/step - accuracy: 0.8026 - loss: 0.8765 - val_accuracy: 0.8167 - val_loss: 0.4054 Epoch 52/150 219/219 - 1s - 4ms/step - accuracy: 0.8083 - loss: 0.8723 - val_accuracy: 0.7928 - val_loss: 0.4333 Epoch 53/150 219/219 - 1s - 4ms/step - accuracy: 0.8090 - loss: 0.8704 - val_accuracy: 0.8072 - val_loss: 0.4189 Epoch 54/150 219/219 - 1s - 3ms/step - accuracy: 0.8080 - loss: 0.8691 - val_accuracy: 0.7378 - val_loss: 0.5232 Epoch 55/150 219/219 - 1s - 2ms/step - accuracy: 0.8034 - loss: 0.8781 - val_accuracy: 0.8044 - val_loss: 0.4241 Epoch 56/150 219/219 - 1s - 3ms/step - accuracy: 0.8051 - loss: 0.8680 - val_accuracy: 0.7328 - val_loss: 0.5217 Epoch 57/150 219/219 - 1s - 3ms/step - accuracy: 0.8064 - loss: 0.8713 - val_accuracy: 0.8183 - val_loss: 0.4027 Epoch 58/150 219/219 - 1s - 3ms/step - accuracy: 0.8091 - loss: 0.8643 - val_accuracy: 0.7972 - val_loss: 0.4327 Epoch 59/150 219/219 - 1s - 3ms/step - accuracy: 0.8081 - loss: 0.8627 - val_accuracy: 0.7767 - val_loss: 0.4492 Epoch 60/150 219/219 - 1s - 3ms/step - accuracy: 0.8046 - loss: 0.8622 - val_accuracy: 0.7994 - val_loss: 0.4364 Epoch 61/150 219/219 - 1s - 3ms/step - accuracy: 0.8057 - loss: 0.8649 - val_accuracy: 0.7667 - val_loss: 0.4787 Epoch 62/150 219/219 - 1s - 3ms/step - accuracy: 0.8009 - loss: 0.8635 - val_accuracy: 0.8167 - val_loss: 0.4155 Epoch 63/150 219/219 - 1s - 3ms/step - accuracy: 0.8036 - loss: 0.8612 - val_accuracy: 0.8172 - val_loss: 0.4069 Epoch 64/150 219/219 - 1s - 3ms/step - accuracy: 0.8104 - loss: 0.8567 - val_accuracy: 0.7539 - val_loss: 0.4992 Epoch 65/150 219/219 - 1s - 3ms/step - accuracy: 0.8101 - loss: 0.8571 - val_accuracy: 0.7956 - val_loss: 0.4390 Epoch 66/150 219/219 - 1s - 3ms/step - accuracy: 0.8084 - loss: 0.8560 - val_accuracy: 0.8033 - val_loss: 0.4210 Epoch 67/150 219/219 - 1s - 2ms/step - accuracy: 0.8060 - loss: 0.8585 - val_accuracy: 0.7911 - val_loss: 0.4392 Epoch 68/150 219/219 - 1s - 3ms/step - accuracy: 0.8093 - loss: 0.8597 - val_accuracy: 0.7683 - val_loss: 0.4728 Epoch 69/150 219/219 - 1s - 4ms/step - accuracy: 0.8076 - loss: 0.8529 - val_accuracy: 0.8228 - val_loss: 0.3909 Epoch 70/150 219/219 - 1s - 6ms/step - accuracy: 0.8100 - loss: 0.8552 - val_accuracy: 0.7750 - val_loss: 0.4601 Epoch 71/150 219/219 - 1s - 6ms/step - accuracy: 0.8099 - loss: 0.8514 - val_accuracy: 0.8311 - val_loss: 0.3901 Epoch 72/150 219/219 - 1s - 4ms/step - accuracy: 0.8100 - loss: 0.8461 - val_accuracy: 0.7883 - val_loss: 0.4661 Epoch 73/150 219/219 - 1s - 3ms/step - accuracy: 0.8107 - loss: 0.8481 - val_accuracy: 0.7939 - val_loss: 0.4355 Epoch 74/150 219/219 - 1s - 3ms/step - accuracy: 0.8063 - loss: 0.8502 - val_accuracy: 0.8422 - val_loss: 0.3717 Epoch 75/150 219/219 - 1s - 3ms/step - accuracy: 0.8094 - loss: 0.8498 - val_accuracy: 0.7922 - val_loss: 0.4407 Epoch 76/150 219/219 - 1s - 2ms/step - accuracy: 0.8099 - loss: 0.8411 - val_accuracy: 0.7906 - val_loss: 0.4429 Epoch 77/150 219/219 - 1s - 3ms/step - accuracy: 0.8120 - loss: 0.8430 - val_accuracy: 0.7756 - val_loss: 0.4622 Epoch 78/150 219/219 - 1s - 3ms/step - accuracy: 0.8067 - loss: 0.8476 - val_accuracy: 0.7828 - val_loss: 0.4442 Epoch 79/150 219/219 - 1s - 3ms/step - accuracy: 0.8100 - loss: 0.8428 - val_accuracy: 0.8033 - val_loss: 0.4285 Epoch 80/150 219/219 - 1s - 3ms/step - accuracy: 0.8040 - loss: 0.8429 - val_accuracy: 0.7761 - val_loss: 0.4704 Epoch 81/150 219/219 - 1s - 3ms/step - accuracy: 0.8131 - loss: 0.8393 - val_accuracy: 0.7750 - val_loss: 0.4618 Epoch 82/150 219/219 - 1s - 3ms/step - accuracy: 0.8117 - loss: 0.8358 - val_accuracy: 0.7383 - val_loss: 0.5201 Epoch 83/150 219/219 - 1s - 3ms/step - accuracy: 0.8113 - loss: 0.8398 - val_accuracy: 0.7289 - val_loss: 0.5412 Epoch 84/150 219/219 - 1s - 3ms/step - accuracy: 0.8139 - loss: 0.8416 - val_accuracy: 0.8028 - val_loss: 0.4187 Epoch 85/150 219/219 - 1s - 3ms/step - accuracy: 0.8127 - loss: 0.8381 - val_accuracy: 0.8094 - val_loss: 0.4198 Epoch 86/150 219/219 - 1s - 3ms/step - accuracy: 0.8103 - loss: 0.8360 - val_accuracy: 0.7878 - val_loss: 0.4499 Epoch 87/150 219/219 - 1s - 3ms/step - accuracy: 0.8106 - loss: 0.8374 - val_accuracy: 0.8228 - val_loss: 0.4007 Epoch 88/150 219/219 - 1s - 5ms/step - accuracy: 0.8177 - loss: 0.8359 - val_accuracy: 0.7583 - val_loss: 0.4919 Epoch 89/150 219/219 - 1s - 5ms/step - accuracy: 0.8141 - loss: 0.8295 - val_accuracy: 0.8100 - val_loss: 0.4142 Epoch 90/150 219/219 - 1s - 6ms/step - accuracy: 0.8094 - loss: 0.8391 - val_accuracy: 0.7522 - val_loss: 0.5000 Epoch 91/150 219/219 - 1s - 5ms/step - accuracy: 0.8186 - loss: 0.8279 - val_accuracy: 0.7217 - val_loss: 0.5435 Epoch 92/150 219/219 - 1s - 3ms/step - accuracy: 0.8097 - loss: 0.8341 - val_accuracy: 0.8011 - val_loss: 0.4321 Epoch 93/150 219/219 - 1s - 3ms/step - accuracy: 0.8177 - loss: 0.8262 - val_accuracy: 0.7872 - val_loss: 0.4508 Epoch 94/150 219/219 - 1s - 2ms/step - accuracy: 0.8131 - loss: 0.8273 - val_accuracy: 0.7183 - val_loss: 0.5555 Epoch 95/150 219/219 - 1s - 3ms/step - accuracy: 0.8116 - loss: 0.8306 - val_accuracy: 0.7900 - val_loss: 0.4495 Epoch 96/150 219/219 - 1s - 3ms/step - accuracy: 0.8184 - loss: 0.8295 - val_accuracy: 0.7556 - val_loss: 0.4975 Epoch 97/150 219/219 - 1s - 3ms/step - accuracy: 0.8134 - loss: 0.8240 - val_accuracy: 0.7811 - val_loss: 0.4699 Epoch 98/150 219/219 - 1s - 3ms/step - accuracy: 0.8147 - loss: 0.8258 - val_accuracy: 0.7817 - val_loss: 0.4593 Epoch 99/150 219/219 - 1s - 3ms/step - accuracy: 0.8156 - loss: 0.8193 - val_accuracy: 0.7628 - val_loss: 0.4869 Epoch 100/150 219/219 - 1s - 3ms/step - accuracy: 0.8170 - loss: 0.8204 - val_accuracy: 0.7667 - val_loss: 0.4796 Epoch 101/150 219/219 - 1s - 3ms/step - accuracy: 0.8196 - loss: 0.8239 - val_accuracy: 0.7894 - val_loss: 0.4550 Epoch 102/150 219/219 - 1s - 3ms/step - accuracy: 0.8130 - loss: 0.8172 - val_accuracy: 0.7856 - val_loss: 0.4508 Epoch 103/150 219/219 - 1s - 2ms/step - accuracy: 0.8211 - loss: 0.8196 - val_accuracy: 0.7700 - val_loss: 0.4820 Epoch 104/150 219/219 - 1s - 3ms/step - accuracy: 0.8184 - loss: 0.8148 - val_accuracy: 0.8317 - val_loss: 0.3794 Epoch 105/150 219/219 - 1s - 3ms/step - accuracy: 0.8189 - loss: 0.8097 - val_accuracy: 0.7878 - val_loss: 0.4501 Epoch 106/150 219/219 - 1s - 4ms/step - accuracy: 0.8147 - loss: 0.8174 - val_accuracy: 0.8200 - val_loss: 0.4150 Epoch 107/150 219/219 - 1s - 6ms/step - accuracy: 0.8180 - loss: 0.8137 - val_accuracy: 0.7928 - val_loss: 0.4480 Epoch 108/150 219/219 - 2s - 7ms/step - accuracy: 0.8164 - loss: 0.8181 - val_accuracy: 0.7922 - val_loss: 0.4403 Epoch 109/150 219/219 - 1s - 3ms/step - accuracy: 0.8216 - loss: 0.8080 - val_accuracy: 0.7644 - val_loss: 0.4720 Epoch 110/150 219/219 - 1s - 2ms/step - accuracy: 0.8221 - loss: 0.8101 - val_accuracy: 0.7606 - val_loss: 0.5020 Epoch 111/150 219/219 - 1s - 3ms/step - accuracy: 0.8140 - loss: 0.8154 - val_accuracy: 0.8061 - val_loss: 0.4287 Epoch 112/150 219/219 - 1s - 3ms/step - accuracy: 0.8181 - loss: 0.8096 - val_accuracy: 0.7794 - val_loss: 0.4634 Epoch 113/150 219/219 - 1s - 3ms/step - accuracy: 0.8241 - loss: 0.8055 - val_accuracy: 0.7517 - val_loss: 0.5118 Epoch 114/150 219/219 - 1s - 3ms/step - accuracy: 0.8187 - loss: 0.8055 - val_accuracy: 0.8011 - val_loss: 0.4332 Epoch 115/150 219/219 - 1s - 3ms/step - accuracy: 0.8236 - loss: 0.8045 - val_accuracy: 0.7717 - val_loss: 0.4827 Epoch 116/150 219/219 - 1s - 3ms/step - accuracy: 0.8167 - loss: 0.8047 - val_accuracy: 0.8089 - val_loss: 0.4278 Epoch 117/150 219/219 - 1s - 3ms/step - accuracy: 0.8180 - loss: 0.8083 - val_accuracy: 0.7456 - val_loss: 0.5120 Epoch 118/150 219/219 - 1s - 3ms/step - accuracy: 0.8199 - loss: 0.8056 - val_accuracy: 0.8233 - val_loss: 0.3960 Epoch 119/150 219/219 - 1s - 3ms/step - accuracy: 0.8154 - loss: 0.8126 - val_accuracy: 0.7433 - val_loss: 0.5311 Epoch 120/150 219/219 - 1s - 3ms/step - accuracy: 0.8234 - loss: 0.7929 - val_accuracy: 0.7956 - val_loss: 0.4589 Epoch 121/150 219/219 - 1s - 3ms/step - accuracy: 0.8233 - loss: 0.7945 - val_accuracy: 0.8000 - val_loss: 0.4375 Epoch 122/150 219/219 - 1s - 3ms/step - accuracy: 0.8236 - loss: 0.7985 - val_accuracy: 0.7917 - val_loss: 0.4573 Epoch 123/150 219/219 - 1s - 3ms/step - accuracy: 0.8214 - loss: 0.7935 - val_accuracy: 0.7422 - val_loss: 0.5249 Epoch 124/150 219/219 - 1s - 3ms/step - accuracy: 0.8207 - loss: 0.8008 - val_accuracy: 0.7783 - val_loss: 0.4723 Epoch 125/150 219/219 - 1s - 4ms/step - accuracy: 0.8217 - loss: 0.7960 - val_accuracy: 0.7467 - val_loss: 0.5116 Epoch 126/150 219/219 - 1s - 5ms/step - accuracy: 0.8204 - loss: 0.7878 - val_accuracy: 0.8189 - val_loss: 0.4006 Epoch 127/150 219/219 - 1s - 6ms/step - accuracy: 0.8256 - loss: 0.7986 - val_accuracy: 0.8367 - val_loss: 0.3849 Epoch 128/150 219/219 - 1s - 4ms/step - accuracy: 0.8266 - loss: 0.7856 - val_accuracy: 0.8000 - val_loss: 0.4442 Epoch 129/150 219/219 - 1s - 3ms/step - accuracy: 0.8231 - loss: 0.7911 - val_accuracy: 0.7817 - val_loss: 0.4645 Epoch 130/150 219/219 - 1s - 3ms/step - accuracy: 0.8253 - loss: 0.7872 - val_accuracy: 0.7822 - val_loss: 0.4667 Epoch 131/150 219/219 - 1s - 3ms/step - accuracy: 0.8241 - loss: 0.7868 - val_accuracy: 0.7706 - val_loss: 0.4794 Epoch 132/150 219/219 - 1s - 3ms/step - accuracy: 0.8211 - loss: 0.7895 - val_accuracy: 0.8156 - val_loss: 0.4210 Epoch 133/150 219/219 - 1s - 3ms/step - accuracy: 0.8210 - loss: 0.7872 - val_accuracy: 0.8050 - val_loss: 0.4322 Epoch 134/150 219/219 - 1s - 2ms/step - accuracy: 0.8264 - loss: 0.7827 - val_accuracy: 0.7950 - val_loss: 0.4414 Epoch 135/150 219/219 - 1s - 3ms/step - accuracy: 0.8249 - loss: 0.7846 - val_accuracy: 0.7806 - val_loss: 0.4669 Epoch 136/150 219/219 - 1s - 3ms/step - accuracy: 0.8236 - loss: 0.7730 - val_accuracy: 0.7622 - val_loss: 0.5103 Epoch 137/150 219/219 - 1s - 3ms/step - accuracy: 0.8274 - loss: 0.7744 - val_accuracy: 0.7878 - val_loss: 0.4585 Epoch 138/150 219/219 - 1s - 3ms/step - accuracy: 0.8213 - loss: 0.7854 - val_accuracy: 0.7978 - val_loss: 0.4494 Epoch 139/150 219/219 - 1s - 3ms/step - accuracy: 0.8220 - loss: 0.7796 - val_accuracy: 0.8244 - val_loss: 0.3891 Epoch 140/150 219/219 - 1s - 3ms/step - accuracy: 0.8237 - loss: 0.7892 - val_accuracy: 0.8033 - val_loss: 0.4384 Epoch 141/150 219/219 - 1s - 3ms/step - accuracy: 0.8279 - loss: 0.7723 - val_accuracy: 0.7650 - val_loss: 0.5058 Epoch 142/150 219/219 - 1s - 3ms/step - accuracy: 0.8213 - loss: 0.7805 - val_accuracy: 0.7828 - val_loss: 0.4758 Epoch 143/150 219/219 - 1s - 3ms/step - accuracy: 0.8239 - loss: 0.7729 - val_accuracy: 0.7944 - val_loss: 0.4511 Epoch 144/150 219/219 - 1s - 3ms/step - accuracy: 0.8273 - loss: 0.7722 - val_accuracy: 0.7389 - val_loss: 0.5224 Epoch 145/150 219/219 - 1s - 4ms/step - accuracy: 0.8270 - loss: 0.7738 - val_accuracy: 0.8006 - val_loss: 0.4500 Epoch 146/150 219/219 - 1s - 5ms/step - accuracy: 0.8280 - loss: 0.7694 - val_accuracy: 0.7722 - val_loss: 0.4781 Epoch 147/150 219/219 - 1s - 6ms/step - accuracy: 0.8307 - loss: 0.7697 - val_accuracy: 0.7278 - val_loss: 0.5485 Epoch 148/150 219/219 - 1s - 4ms/step - accuracy: 0.8299 - loss: 0.7676 - val_accuracy: 0.7878 - val_loss: 0.4682 Epoch 149/150 219/219 - 1s - 3ms/step - accuracy: 0.8324 - loss: 0.7662 - val_accuracy: 0.7956 - val_loss: 0.4503 Epoch 150/150 219/219 - 1s - 3ms/step - accuracy: 0.8269 - loss: 0.7695 - val_accuracy: 0.7783 - val_loss: 0.4684
#plot accuracy curve
plot(history7,'accuracy')
#plot loss curve
plot(history7,'loss')
#Comapring all results
results.loc[7] = [2,[64,32],['relu','tanh'],150,32,"sgd",[0.001,0.9],"xavier","-",history7.history["loss"][-1],history7.history["val_loss"][-1],history7.history["accuracy"][-1],history7.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
#model performance
model_performance_classification(model7, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.829143 | 0.829143 | 0.873458 | 0.841088 |
#model performance for validation
model_performance_classification(model7, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.778333 | 0.778333 | 0.827245 | 0.793238 |
Observations
- Model loss has reduced and improved training accuracy but not validationaccuracy more oscillations.
- Let's make few changes with respect to number of epoches and learning rate and momemtum
Model 8
Let's design model with change in momentum and chage in batch size and neuron numbers
- 1 input, 2 hidden(64,32), 1 output layers
- relu and tanh activations for the first and second hidden layer respectively
- Stochastic Gradient Descent (SGD)
- learning rate=0.001,momemtum =0.7
- batch size=32
- epoch=150
#clear keras session
tf.keras.backend.clear_session()
#define model
model8 = Sequential()
model8.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model8.add(Dense(32,activation="tanh"))
model8.add(Dense(1,activation="sigmoid"))
#model summary
model8.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#Complie model
optimizer = tf.keras.optimizers.SGD(learning_rate=0.001, momentum=0.7) # defining SGD as the optimizer to be used
model8.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#Start training
start = time.time()
history8 = model8.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 32)
Epoch 1/150 219/219 - 2s - 9ms/step - accuracy: 0.5714 - loss: 1.3326 - val_accuracy: 0.6511 - val_loss: 0.6335 Epoch 2/150 219/219 - 1s - 3ms/step - accuracy: 0.6820 - loss: 1.2539 - val_accuracy: 0.6811 - val_loss: 0.6168 Epoch 3/150 219/219 - 1s - 3ms/step - accuracy: 0.6960 - loss: 1.2168 - val_accuracy: 0.7028 - val_loss: 0.5987 Epoch 4/150 219/219 - 1s - 3ms/step - accuracy: 0.7024 - loss: 1.1949 - val_accuracy: 0.7378 - val_loss: 0.5635 Epoch 5/150 219/219 - 1s - 3ms/step - accuracy: 0.7224 - loss: 1.1798 - val_accuracy: 0.6978 - val_loss: 0.5977 Epoch 6/150 219/219 - 1s - 3ms/step - accuracy: 0.7093 - loss: 1.1707 - val_accuracy: 0.7200 - val_loss: 0.5717 Epoch 7/150 219/219 - 1s - 3ms/step - accuracy: 0.7097 - loss: 1.1619 - val_accuracy: 0.7256 - val_loss: 0.5622 Epoch 8/150 219/219 - 1s - 3ms/step - accuracy: 0.7136 - loss: 1.1559 - val_accuracy: 0.7378 - val_loss: 0.5482 Epoch 9/150 219/219 - 1s - 3ms/step - accuracy: 0.7160 - loss: 1.1508 - val_accuracy: 0.7283 - val_loss: 0.5548 Epoch 10/150 219/219 - 1s - 3ms/step - accuracy: 0.7119 - loss: 1.1462 - val_accuracy: 0.7156 - val_loss: 0.5632 Epoch 11/150 219/219 - 1s - 3ms/step - accuracy: 0.7111 - loss: 1.1421 - val_accuracy: 0.7194 - val_loss: 0.5548 Epoch 12/150 219/219 - 1s - 3ms/step - accuracy: 0.7154 - loss: 1.1388 - val_accuracy: 0.7161 - val_loss: 0.5604 Epoch 13/150 219/219 - 1s - 3ms/step - accuracy: 0.7107 - loss: 1.1352 - val_accuracy: 0.7278 - val_loss: 0.5447 Epoch 14/150 219/219 - 1s - 4ms/step - accuracy: 0.7110 - loss: 1.1323 - val_accuracy: 0.7278 - val_loss: 0.5416 Epoch 15/150 219/219 - 1s - 6ms/step - accuracy: 0.7109 - loss: 1.1296 - val_accuracy: 0.7294 - val_loss: 0.5410 Epoch 16/150 219/219 - 1s - 6ms/step - accuracy: 0.7103 - loss: 1.1268 - val_accuracy: 0.7339 - val_loss: 0.5327 Epoch 17/150 219/219 - 1s - 2ms/step - accuracy: 0.7121 - loss: 1.1241 - val_accuracy: 0.7311 - val_loss: 0.5399 Epoch 18/150 219/219 - 1s - 3ms/step - accuracy: 0.7093 - loss: 1.1216 - val_accuracy: 0.7322 - val_loss: 0.5371 Epoch 19/150 219/219 - 1s - 3ms/step - accuracy: 0.7091 - loss: 1.1190 - val_accuracy: 0.7233 - val_loss: 0.5443 Epoch 20/150 219/219 - 1s - 3ms/step - accuracy: 0.7099 - loss: 1.1163 - val_accuracy: 0.7222 - val_loss: 0.5441 Epoch 21/150 219/219 - 1s - 3ms/step - accuracy: 0.7110 - loss: 1.1140 - val_accuracy: 0.7139 - val_loss: 0.5493 Epoch 22/150 219/219 - 1s - 3ms/step - accuracy: 0.7077 - loss: 1.1111 - val_accuracy: 0.7167 - val_loss: 0.5467 Epoch 23/150 219/219 - 1s - 3ms/step - accuracy: 0.7104 - loss: 1.1085 - val_accuracy: 0.7261 - val_loss: 0.5355 Epoch 24/150 219/219 - 1s - 3ms/step - accuracy: 0.7094 - loss: 1.1059 - val_accuracy: 0.7211 - val_loss: 0.5366 Epoch 25/150 219/219 - 1s - 3ms/step - accuracy: 0.7091 - loss: 1.1024 - val_accuracy: 0.7422 - val_loss: 0.5151 Epoch 26/150 219/219 - 1s - 2ms/step - accuracy: 0.7154 - loss: 1.1012 - val_accuracy: 0.7150 - val_loss: 0.5459 Epoch 27/150 219/219 - 1s - 3ms/step - accuracy: 0.7160 - loss: 1.0978 - val_accuracy: 0.7089 - val_loss: 0.5539 Epoch 28/150 219/219 - 1s - 2ms/step - accuracy: 0.7093 - loss: 1.0959 - val_accuracy: 0.7350 - val_loss: 0.5244 Epoch 29/150 219/219 - 1s - 3ms/step - accuracy: 0.7147 - loss: 1.0919 - val_accuracy: 0.7372 - val_loss: 0.5175 Epoch 30/150 219/219 - 1s - 3ms/step - accuracy: 0.7179 - loss: 1.0911 - val_accuracy: 0.7217 - val_loss: 0.5349 Epoch 31/150 219/219 - 1s - 3ms/step - accuracy: 0.7186 - loss: 1.0870 - val_accuracy: 0.7400 - val_loss: 0.5139 Epoch 32/150 219/219 - 1s - 3ms/step - accuracy: 0.7231 - loss: 1.0844 - val_accuracy: 0.7144 - val_loss: 0.5467 Epoch 33/150 219/219 - 1s - 3ms/step - accuracy: 0.7213 - loss: 1.0819 - val_accuracy: 0.7289 - val_loss: 0.5275 Epoch 34/150 219/219 - 1s - 4ms/step - accuracy: 0.7224 - loss: 1.0784 - val_accuracy: 0.7300 - val_loss: 0.5277 Epoch 35/150 219/219 - 1s - 4ms/step - accuracy: 0.7243 - loss: 1.0757 - val_accuracy: 0.7272 - val_loss: 0.5314 Epoch 36/150 219/219 - 1s - 6ms/step - accuracy: 0.7260 - loss: 1.0719 - val_accuracy: 0.7383 - val_loss: 0.5184 Epoch 37/150 219/219 - 1s - 3ms/step - accuracy: 0.7333 - loss: 1.0690 - val_accuracy: 0.7228 - val_loss: 0.5374 Epoch 38/150 219/219 - 1s - 3ms/step - accuracy: 0.7317 - loss: 1.0652 - val_accuracy: 0.7228 - val_loss: 0.5356 Epoch 39/150 219/219 - 1s - 3ms/step - accuracy: 0.7323 - loss: 1.0620 - val_accuracy: 0.7383 - val_loss: 0.5219 Epoch 40/150 219/219 - 1s - 3ms/step - accuracy: 0.7389 - loss: 1.0585 - val_accuracy: 0.7239 - val_loss: 0.5367 Epoch 41/150 219/219 - 1s - 3ms/step - accuracy: 0.7347 - loss: 1.0553 - val_accuracy: 0.7367 - val_loss: 0.5222 Epoch 42/150 219/219 - 1s - 3ms/step - accuracy: 0.7400 - loss: 1.0511 - val_accuracy: 0.7389 - val_loss: 0.5174 Epoch 43/150 219/219 - 1s - 3ms/step - accuracy: 0.7419 - loss: 1.0486 - val_accuracy: 0.7411 - val_loss: 0.5160 Epoch 44/150 219/219 - 1s - 2ms/step - accuracy: 0.7396 - loss: 1.0441 - val_accuracy: 0.7489 - val_loss: 0.5021 Epoch 45/150 219/219 - 1s - 3ms/step - accuracy: 0.7460 - loss: 1.0405 - val_accuracy: 0.7422 - val_loss: 0.5150 Epoch 46/150 219/219 - 1s - 3ms/step - accuracy: 0.7500 - loss: 1.0373 - val_accuracy: 0.7306 - val_loss: 0.5250 Epoch 47/150 219/219 - 1s - 3ms/step - accuracy: 0.7504 - loss: 1.0330 - val_accuracy: 0.7206 - val_loss: 0.5369 Epoch 48/150 219/219 - 1s - 2ms/step - accuracy: 0.7457 - loss: 1.0299 - val_accuracy: 0.7583 - val_loss: 0.4869 Epoch 49/150 219/219 - 1s - 3ms/step - accuracy: 0.7567 - loss: 1.0269 - val_accuracy: 0.7400 - val_loss: 0.5155 Epoch 50/150 219/219 - 1s - 3ms/step - accuracy: 0.7520 - loss: 1.0221 - val_accuracy: 0.7533 - val_loss: 0.5009 Epoch 51/150 219/219 - 1s - 3ms/step - accuracy: 0.7556 - loss: 1.0188 - val_accuracy: 0.7533 - val_loss: 0.4981 Epoch 52/150 219/219 - 1s - 2ms/step - accuracy: 0.7571 - loss: 1.0152 - val_accuracy: 0.7522 - val_loss: 0.5007 Epoch 53/150 219/219 - 1s - 3ms/step - accuracy: 0.7567 - loss: 1.0112 - val_accuracy: 0.7594 - val_loss: 0.4964 Epoch 54/150 219/219 - 1s - 5ms/step - accuracy: 0.7619 - loss: 1.0076 - val_accuracy: 0.7533 - val_loss: 0.5026 Epoch 55/150 219/219 - 1s - 5ms/step - accuracy: 0.7639 - loss: 1.0047 - val_accuracy: 0.7406 - val_loss: 0.5124 Epoch 56/150 219/219 - 1s - 6ms/step - accuracy: 0.7621 - loss: 1.0009 - val_accuracy: 0.7517 - val_loss: 0.5038 Epoch 57/150 219/219 - 1s - 5ms/step - accuracy: 0.7623 - loss: 0.9972 - val_accuracy: 0.7561 - val_loss: 0.4980 Epoch 58/150 219/219 - 1s - 2ms/step - accuracy: 0.7640 - loss: 0.9946 - val_accuracy: 0.7628 - val_loss: 0.4883 Epoch 59/150 219/219 - 1s - 2ms/step - accuracy: 0.7709 - loss: 0.9915 - val_accuracy: 0.7494 - val_loss: 0.5063 Epoch 60/150 219/219 - 1s - 3ms/step - accuracy: 0.7660 - loss: 0.9874 - val_accuracy: 0.7683 - val_loss: 0.4838 Epoch 61/150 219/219 - 1s - 3ms/step - accuracy: 0.7687 - loss: 0.9855 - val_accuracy: 0.7617 - val_loss: 0.4899 Epoch 62/150 219/219 - 1s - 3ms/step - accuracy: 0.7673 - loss: 0.9819 - val_accuracy: 0.7711 - val_loss: 0.4812 Epoch 63/150 219/219 - 1s - 3ms/step - accuracy: 0.7693 - loss: 0.9786 - val_accuracy: 0.7722 - val_loss: 0.4789 Epoch 64/150 219/219 - 1s - 3ms/step - accuracy: 0.7720 - loss: 0.9763 - val_accuracy: 0.7689 - val_loss: 0.4817 Epoch 65/150 219/219 - 1s - 3ms/step - accuracy: 0.7716 - loss: 0.9720 - val_accuracy: 0.7739 - val_loss: 0.4803 Epoch 66/150 219/219 - 1s - 3ms/step - accuracy: 0.7707 - loss: 0.9709 - val_accuracy: 0.7606 - val_loss: 0.4918 Epoch 67/150 219/219 - 1s - 3ms/step - accuracy: 0.7736 - loss: 0.9672 - val_accuracy: 0.7556 - val_loss: 0.4979 Epoch 68/150 219/219 - 1s - 3ms/step - accuracy: 0.7744 - loss: 0.9648 - val_accuracy: 0.7578 - val_loss: 0.4949 Epoch 69/150 219/219 - 1s - 3ms/step - accuracy: 0.7721 - loss: 0.9618 - val_accuracy: 0.7694 - val_loss: 0.4836 Epoch 70/150 219/219 - 1s - 3ms/step - accuracy: 0.7746 - loss: 0.9597 - val_accuracy: 0.7789 - val_loss: 0.4751 Epoch 71/150 219/219 - 1s - 3ms/step - accuracy: 0.7747 - loss: 0.9572 - val_accuracy: 0.7667 - val_loss: 0.4814 Epoch 72/150 219/219 - 1s - 3ms/step - accuracy: 0.7751 - loss: 0.9549 - val_accuracy: 0.7839 - val_loss: 0.4703 Epoch 73/150 219/219 - 1s - 4ms/step - accuracy: 0.7809 - loss: 0.9521 - val_accuracy: 0.7572 - val_loss: 0.4987 Epoch 74/150 219/219 - 1s - 6ms/step - accuracy: 0.7729 - loss: 0.9475 - val_accuracy: 0.7972 - val_loss: 0.4359 Epoch 75/150 219/219 - 1s - 5ms/step - accuracy: 0.7831 - loss: 0.9481 - val_accuracy: 0.7772 - val_loss: 0.4753 Epoch 76/150 219/219 - 1s - 2ms/step - accuracy: 0.7776 - loss: 0.9457 - val_accuracy: 0.7844 - val_loss: 0.4648 Epoch 77/150 219/219 - 1s - 2ms/step - accuracy: 0.7803 - loss: 0.9434 - val_accuracy: 0.7856 - val_loss: 0.4656 Epoch 78/150 219/219 - 1s - 3ms/step - accuracy: 0.7813 - loss: 0.9411 - val_accuracy: 0.7811 - val_loss: 0.4656 Epoch 79/150 219/219 - 1s - 3ms/step - accuracy: 0.7799 - loss: 0.9383 - val_accuracy: 0.7694 - val_loss: 0.4856 Epoch 80/150 219/219 - 1s - 3ms/step - accuracy: 0.7809 - loss: 0.9373 - val_accuracy: 0.7722 - val_loss: 0.4805 Epoch 81/150 219/219 - 1s - 2ms/step - accuracy: 0.7816 - loss: 0.9342 - val_accuracy: 0.7817 - val_loss: 0.4674 Epoch 82/150 219/219 - 1s - 3ms/step - accuracy: 0.7809 - loss: 0.9339 - val_accuracy: 0.7783 - val_loss: 0.4699 Epoch 83/150 219/219 - 1s - 2ms/step - accuracy: 0.7806 - loss: 0.9318 - val_accuracy: 0.7817 - val_loss: 0.4648 Epoch 84/150 219/219 - 1s - 3ms/step - accuracy: 0.7833 - loss: 0.9289 - val_accuracy: 0.7950 - val_loss: 0.4473 Epoch 85/150 219/219 - 1s - 3ms/step - accuracy: 0.7839 - loss: 0.9268 - val_accuracy: 0.7639 - val_loss: 0.4858 Epoch 86/150 219/219 - 1s - 2ms/step - accuracy: 0.7827 - loss: 0.9257 - val_accuracy: 0.7689 - val_loss: 0.4802 Epoch 87/150 219/219 - 1s - 3ms/step - accuracy: 0.7844 - loss: 0.9240 - val_accuracy: 0.7967 - val_loss: 0.4432 Epoch 88/150 219/219 - 1s - 2ms/step - accuracy: 0.7861 - loss: 0.9223 - val_accuracy: 0.7833 - val_loss: 0.4591 Epoch 89/150 219/219 - 1s - 3ms/step - accuracy: 0.7854 - loss: 0.9204 - val_accuracy: 0.7556 - val_loss: 0.4946 Epoch 90/150 219/219 - 1s - 3ms/step - accuracy: 0.7841 - loss: 0.9201 - val_accuracy: 0.7517 - val_loss: 0.4973 Epoch 91/150 219/219 - 1s - 3ms/step - accuracy: 0.7843 - loss: 0.9176 - val_accuracy: 0.7978 - val_loss: 0.4473 Epoch 92/150 219/219 - 1s - 4ms/step - accuracy: 0.7864 - loss: 0.9165 - val_accuracy: 0.7828 - val_loss: 0.4670 Epoch 93/150 219/219 - 1s - 5ms/step - accuracy: 0.7916 - loss: 0.9153 - val_accuracy: 0.7683 - val_loss: 0.4744 Epoch 94/150 219/219 - 1s - 6ms/step - accuracy: 0.7864 - loss: 0.9133 - val_accuracy: 0.7700 - val_loss: 0.4808 Epoch 95/150 219/219 - 1s - 5ms/step - accuracy: 0.7837 - loss: 0.9121 - val_accuracy: 0.7983 - val_loss: 0.4347 Epoch 96/150 219/219 - 1s - 3ms/step - accuracy: 0.7894 - loss: 0.9113 - val_accuracy: 0.7878 - val_loss: 0.4592 Epoch 97/150 219/219 - 1s - 3ms/step - accuracy: 0.7897 - loss: 0.9097 - val_accuracy: 0.7794 - val_loss: 0.4558 Epoch 98/150 219/219 - 1s - 3ms/step - accuracy: 0.7910 - loss: 0.9094 - val_accuracy: 0.7839 - val_loss: 0.4538 Epoch 99/150 219/219 - 1s - 3ms/step - accuracy: 0.7931 - loss: 0.9072 - val_accuracy: 0.7639 - val_loss: 0.4827 Epoch 100/150 219/219 - 1s - 3ms/step - accuracy: 0.7856 - loss: 0.9053 - val_accuracy: 0.8006 - val_loss: 0.4389 Epoch 101/150 219/219 - 1s - 3ms/step - accuracy: 0.7927 - loss: 0.9055 - val_accuracy: 0.7778 - val_loss: 0.4731 Epoch 102/150 219/219 - 1s - 3ms/step - accuracy: 0.7946 - loss: 0.9027 - val_accuracy: 0.7633 - val_loss: 0.4824 Epoch 103/150 219/219 - 1s - 3ms/step - accuracy: 0.7941 - loss: 0.9003 - val_accuracy: 0.7756 - val_loss: 0.4712 Epoch 104/150 219/219 - 1s - 3ms/step - accuracy: 0.7923 - loss: 0.9024 - val_accuracy: 0.7661 - val_loss: 0.4817 Epoch 105/150 219/219 - 1s - 2ms/step - accuracy: 0.7939 - loss: 0.8996 - val_accuracy: 0.7778 - val_loss: 0.4628 Epoch 106/150 219/219 - 1s - 3ms/step - accuracy: 0.7941 - loss: 0.8977 - val_accuracy: 0.7906 - val_loss: 0.4522 Epoch 107/150 219/219 - 1s - 3ms/step - accuracy: 0.7963 - loss: 0.8980 - val_accuracy: 0.8061 - val_loss: 0.4305 Epoch 108/150 219/219 - 1s - 3ms/step - accuracy: 0.7974 - loss: 0.8981 - val_accuracy: 0.7606 - val_loss: 0.4920 Epoch 109/150 219/219 - 1s - 3ms/step - accuracy: 0.7923 - loss: 0.8947 - val_accuracy: 0.7839 - val_loss: 0.4492 Epoch 110/150 219/219 - 1s - 3ms/step - accuracy: 0.7941 - loss: 0.8935 - val_accuracy: 0.7750 - val_loss: 0.4616 Epoch 111/150 219/219 - 1s - 4ms/step - accuracy: 0.7926 - loss: 0.8922 - val_accuracy: 0.7767 - val_loss: 0.4616 Epoch 112/150 219/219 - 1s - 4ms/step - accuracy: 0.7939 - loss: 0.8919 - val_accuracy: 0.7956 - val_loss: 0.4380 Epoch 113/150 219/219 - 1s - 7ms/step - accuracy: 0.7946 - loss: 0.8891 - val_accuracy: 0.7739 - val_loss: 0.4688 Epoch 114/150 219/219 - 1s - 4ms/step - accuracy: 0.7959 - loss: 0.8899 - val_accuracy: 0.7783 - val_loss: 0.4619 Epoch 115/150 219/219 - 1s - 2ms/step - accuracy: 0.7933 - loss: 0.8879 - val_accuracy: 0.7872 - val_loss: 0.4505 Epoch 116/150 219/219 - 1s - 3ms/step - accuracy: 0.7973 - loss: 0.8865 - val_accuracy: 0.7744 - val_loss: 0.4647 Epoch 117/150 219/219 - 1s - 3ms/step - accuracy: 0.8001 - loss: 0.8862 - val_accuracy: 0.7889 - val_loss: 0.4413 Epoch 118/150 219/219 - 1s - 3ms/step - accuracy: 0.7987 - loss: 0.8857 - val_accuracy: 0.7822 - val_loss: 0.4492 Epoch 119/150 219/219 - 1s - 3ms/step - accuracy: 0.7967 - loss: 0.8829 - val_accuracy: 0.7767 - val_loss: 0.4697 Epoch 120/150 219/219 - 1s - 3ms/step - accuracy: 0.8020 - loss: 0.8830 - val_accuracy: 0.7750 - val_loss: 0.4688 Epoch 121/150 219/219 - 1s - 3ms/step - accuracy: 0.7967 - loss: 0.8805 - val_accuracy: 0.7806 - val_loss: 0.4590 Epoch 122/150 219/219 - 1s - 3ms/step - accuracy: 0.7987 - loss: 0.8792 - val_accuracy: 0.7839 - val_loss: 0.4446 Epoch 123/150 219/219 - 1s - 3ms/step - accuracy: 0.8004 - loss: 0.8792 - val_accuracy: 0.7833 - val_loss: 0.4495 Epoch 124/150 219/219 - 1s - 3ms/step - accuracy: 0.8043 - loss: 0.8779 - val_accuracy: 0.7583 - val_loss: 0.4845 Epoch 125/150 219/219 - 1s - 3ms/step - accuracy: 0.7957 - loss: 0.8774 - val_accuracy: 0.7961 - val_loss: 0.4337 Epoch 126/150 219/219 - 1s - 2ms/step - accuracy: 0.8021 - loss: 0.8752 - val_accuracy: 0.7950 - val_loss: 0.4379 Epoch 127/150 219/219 - 1s - 3ms/step - accuracy: 0.8007 - loss: 0.8754 - val_accuracy: 0.7678 - val_loss: 0.4722 Epoch 128/150 219/219 - 1s - 3ms/step - accuracy: 0.8043 - loss: 0.8735 - val_accuracy: 0.7694 - val_loss: 0.4737 Epoch 129/150 219/219 - 1s - 3ms/step - accuracy: 0.7997 - loss: 0.8722 - val_accuracy: 0.7750 - val_loss: 0.4662 Epoch 130/150 219/219 - 1s - 4ms/step - accuracy: 0.8051 - loss: 0.8727 - val_accuracy: 0.7856 - val_loss: 0.4472 Epoch 131/150 219/219 - 1s - 6ms/step - accuracy: 0.7994 - loss: 0.8713 - val_accuracy: 0.7983 - val_loss: 0.4344 Epoch 132/150 219/219 - 1s - 5ms/step - accuracy: 0.8079 - loss: 0.8711 - val_accuracy: 0.7706 - val_loss: 0.4701 Epoch 133/150 219/219 - 1s - 5ms/step - accuracy: 0.8019 - loss: 0.8679 - val_accuracy: 0.7789 - val_loss: 0.4662 Epoch 134/150 219/219 - 1s - 3ms/step - accuracy: 0.8060 - loss: 0.8673 - val_accuracy: 0.7922 - val_loss: 0.4428 Epoch 135/150 219/219 - 1s - 3ms/step - accuracy: 0.8044 - loss: 0.8664 - val_accuracy: 0.7950 - val_loss: 0.4361 Epoch 136/150 219/219 - 1s - 3ms/step - accuracy: 0.8061 - loss: 0.8655 - val_accuracy: 0.7817 - val_loss: 0.4577 Epoch 137/150 219/219 - 1s - 2ms/step - accuracy: 0.8070 - loss: 0.8640 - val_accuracy: 0.7339 - val_loss: 0.5309 Epoch 138/150 219/219 - 1s - 2ms/step - accuracy: 0.7996 - loss: 0.8642 - val_accuracy: 0.8078 - val_loss: 0.4172 Epoch 139/150 219/219 - 1s - 3ms/step - accuracy: 0.8073 - loss: 0.8633 - val_accuracy: 0.7783 - val_loss: 0.4574 Epoch 140/150 219/219 - 1s - 3ms/step - accuracy: 0.8047 - loss: 0.8630 - val_accuracy: 0.7828 - val_loss: 0.4503 Epoch 141/150 219/219 - 1s - 3ms/step - accuracy: 0.8063 - loss: 0.8617 - val_accuracy: 0.8056 - val_loss: 0.4248 Epoch 142/150 219/219 - 1s - 3ms/step - accuracy: 0.8060 - loss: 0.8607 - val_accuracy: 0.7778 - val_loss: 0.4622 Epoch 143/150 219/219 - 1s - 3ms/step - accuracy: 0.8049 - loss: 0.8597 - val_accuracy: 0.7978 - val_loss: 0.4295 Epoch 144/150 219/219 - 1s - 3ms/step - accuracy: 0.8061 - loss: 0.8598 - val_accuracy: 0.7961 - val_loss: 0.4357 Epoch 145/150 219/219 - 1s - 3ms/step - accuracy: 0.8069 - loss: 0.8577 - val_accuracy: 0.7794 - val_loss: 0.4618 Epoch 146/150 219/219 - 1s - 3ms/step - accuracy: 0.8054 - loss: 0.8580 - val_accuracy: 0.7683 - val_loss: 0.4699 Epoch 147/150 219/219 - 1s - 3ms/step - accuracy: 0.8069 - loss: 0.8558 - val_accuracy: 0.7911 - val_loss: 0.4408 Epoch 148/150 219/219 - 1s - 4ms/step - accuracy: 0.8094 - loss: 0.8566 - val_accuracy: 0.7956 - val_loss: 0.4377 Epoch 149/150 219/219 - 1s - 6ms/step - accuracy: 0.8070 - loss: 0.8567 - val_accuracy: 0.7839 - val_loss: 0.4536 Epoch 150/150 219/219 - 1s - 4ms/step - accuracy: 0.8107 - loss: 0.8531 - val_accuracy: 0.7678 - val_loss: 0.4780
#observing accuracy plot
plot(history8,'accuracy')
#observing loss
plot(history8,'loss')
#comparing with other models
results.loc[8] = [2,[64,32],['relu','tanh'],50,32,"sgd",[0.001,0.7],"xavier","-",history8.history["loss"][-1],history8.history["val_loss"][-1],history8.history["accuracy"][-1],history8.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
#model performance for training data
model_performance_classification(model8, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.790714 | 0.790714 | 0.853476 | 0.80747 |
#model performace for validation
model_performance_classification(model8, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.767778 | 0.767778 | 0.834837 | 0.785876 |
Observations
- Performance is not that good as previous.
Model 9
Let's design model with change in momentum and chage in batch size and neuron numbers
- 1 input, 2 hidden(64,32), 1 output layers
- relu and tanh activations for the first and second hidden layer respectively
- Stochastic Gradient Descent (SGD)
- learning rate=0.001,momemtum =0.4
- batch size=64
- epoch=100
#clear keras session
tf.keras.backend.clear_session()
#define model
model9 = Sequential()
model9.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model9.add(Dense(32,activation="tanh"))
model9.add(Dense(1,activation="sigmoid"))
#model summary
model9.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
# compile model
optimizer = tf.keras.optimizers.SGD(learning_rate=0.001, momentum=0.4) # defining SGD as the optimizer to be used
model9.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#start training
start = time.time()
history9 = model9.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=100,class_weight=cw_dict, verbose=2,batch_size = 64)
Epoch 1/100 110/110 - 2s - 15ms/step - accuracy: 0.4793 - loss: 1.4018 - val_accuracy: 0.5250 - val_loss: 0.6966 Epoch 2/100 110/110 - 1s - 9ms/step - accuracy: 0.5714 - loss: 1.3622 - val_accuracy: 0.5933 - val_loss: 0.6714 Epoch 3/100 110/110 - 1s - 6ms/step - accuracy: 0.6271 - loss: 1.3307 - val_accuracy: 0.6472 - val_loss: 0.6527 Epoch 4/100 110/110 - 0s - 4ms/step - accuracy: 0.6659 - loss: 1.3048 - val_accuracy: 0.6733 - val_loss: 0.6391 Epoch 5/100 110/110 - 1s - 5ms/step - accuracy: 0.6876 - loss: 1.2829 - val_accuracy: 0.6900 - val_loss: 0.6298 Epoch 6/100 110/110 - 1s - 6ms/step - accuracy: 0.6910 - loss: 1.2646 - val_accuracy: 0.7083 - val_loss: 0.6193 Epoch 7/100 110/110 - 0s - 4ms/step - accuracy: 0.7039 - loss: 1.2490 - val_accuracy: 0.7133 - val_loss: 0.6130 Epoch 8/100 110/110 - 1s - 5ms/step - accuracy: 0.7096 - loss: 1.2358 - val_accuracy: 0.7094 - val_loss: 0.6092 Epoch 9/100 110/110 - 1s - 6ms/step - accuracy: 0.7046 - loss: 1.2246 - val_accuracy: 0.7150 - val_loss: 0.5997 Epoch 10/100 110/110 - 0s - 3ms/step - accuracy: 0.7129 - loss: 1.2150 - val_accuracy: 0.7128 - val_loss: 0.5954 Epoch 11/100 110/110 - 1s - 7ms/step - accuracy: 0.7137 - loss: 1.2069 - val_accuracy: 0.7144 - val_loss: 0.5909 Epoch 12/100 110/110 - 0s - 3ms/step - accuracy: 0.7119 - loss: 1.2000 - val_accuracy: 0.7167 - val_loss: 0.5860 Epoch 13/100 110/110 - 1s - 5ms/step - accuracy: 0.7151 - loss: 1.1938 - val_accuracy: 0.7194 - val_loss: 0.5835 Epoch 14/100 110/110 - 1s - 6ms/step - accuracy: 0.7151 - loss: 1.1885 - val_accuracy: 0.7189 - val_loss: 0.5824 Epoch 15/100 110/110 - 0s - 3ms/step - accuracy: 0.7116 - loss: 1.1839 - val_accuracy: 0.7217 - val_loss: 0.5772 Epoch 16/100 110/110 - 0s - 3ms/step - accuracy: 0.7149 - loss: 1.1798 - val_accuracy: 0.7233 - val_loss: 0.5760 Epoch 17/100 110/110 - 0s - 3ms/step - accuracy: 0.7120 - loss: 1.1762 - val_accuracy: 0.7250 - val_loss: 0.5722 Epoch 18/100 110/110 - 1s - 5ms/step - accuracy: 0.7137 - loss: 1.1729 - val_accuracy: 0.7233 - val_loss: 0.5727 Epoch 19/100 110/110 - 0s - 3ms/step - accuracy: 0.7170 - loss: 1.1698 - val_accuracy: 0.7200 - val_loss: 0.5756 Epoch 20/100 110/110 - 1s - 5ms/step - accuracy: 0.7137 - loss: 1.1670 - val_accuracy: 0.7222 - val_loss: 0.5726 Epoch 21/100 110/110 - 1s - 6ms/step - accuracy: 0.7129 - loss: 1.1645 - val_accuracy: 0.7239 - val_loss: 0.5708 Epoch 22/100 110/110 - 0s - 3ms/step - accuracy: 0.7136 - loss: 1.1622 - val_accuracy: 0.7239 - val_loss: 0.5687 Epoch 23/100 110/110 - 0s - 3ms/step - accuracy: 0.7146 - loss: 1.1600 - val_accuracy: 0.7222 - val_loss: 0.5673 Epoch 24/100 110/110 - 1s - 6ms/step - accuracy: 0.7156 - loss: 1.1581 - val_accuracy: 0.7222 - val_loss: 0.5674 Epoch 25/100 110/110 - 1s - 6ms/step - accuracy: 0.7170 - loss: 1.1559 - val_accuracy: 0.7217 - val_loss: 0.5672 Epoch 26/100 110/110 - 1s - 6ms/step - accuracy: 0.7103 - loss: 1.1541 - val_accuracy: 0.7239 - val_loss: 0.5620 Epoch 27/100 110/110 - 1s - 6ms/step - accuracy: 0.7144 - loss: 1.1523 - val_accuracy: 0.7244 - val_loss: 0.5602 Epoch 28/100 110/110 - 1s - 5ms/step - accuracy: 0.7163 - loss: 1.1506 - val_accuracy: 0.7222 - val_loss: 0.5611 Epoch 29/100 110/110 - 1s - 6ms/step - accuracy: 0.7144 - loss: 1.1489 - val_accuracy: 0.7217 - val_loss: 0.5619 Epoch 30/100 110/110 - 0s - 4ms/step - accuracy: 0.7126 - loss: 1.1474 - val_accuracy: 0.7217 - val_loss: 0.5585 Epoch 31/100 110/110 - 0s - 3ms/step - accuracy: 0.7160 - loss: 1.1458 - val_accuracy: 0.7189 - val_loss: 0.5612 Epoch 32/100 110/110 - 0s - 3ms/step - accuracy: 0.7151 - loss: 1.1445 - val_accuracy: 0.7200 - val_loss: 0.5613 Epoch 33/100 110/110 - 0s - 3ms/step - accuracy: 0.7130 - loss: 1.1431 - val_accuracy: 0.7211 - val_loss: 0.5576 Epoch 34/100 110/110 - 1s - 6ms/step - accuracy: 0.7157 - loss: 1.1416 - val_accuracy: 0.7228 - val_loss: 0.5577 Epoch 35/100 110/110 - 1s - 5ms/step - accuracy: 0.7141 - loss: 1.1404 - val_accuracy: 0.7206 - val_loss: 0.5587 Epoch 36/100 110/110 - 0s - 3ms/step - accuracy: 0.7113 - loss: 1.1391 - val_accuracy: 0.7244 - val_loss: 0.5528 Epoch 37/100 110/110 - 1s - 6ms/step - accuracy: 0.7170 - loss: 1.1379 - val_accuracy: 0.7256 - val_loss: 0.5541 Epoch 38/100 110/110 - 1s - 5ms/step - accuracy: 0.7163 - loss: 1.1367 - val_accuracy: 0.7217 - val_loss: 0.5573 Epoch 39/100 110/110 - 1s - 5ms/step - accuracy: 0.7114 - loss: 1.1356 - val_accuracy: 0.7283 - val_loss: 0.5513 Epoch 40/100 110/110 - 1s - 6ms/step - accuracy: 0.7150 - loss: 1.1344 - val_accuracy: 0.7272 - val_loss: 0.5518 Epoch 41/100 110/110 - 0s - 3ms/step - accuracy: 0.7181 - loss: 1.1332 - val_accuracy: 0.7167 - val_loss: 0.5571 Epoch 42/100 110/110 - 1s - 5ms/step - accuracy: 0.7139 - loss: 1.1324 - val_accuracy: 0.7144 - val_loss: 0.5573 Epoch 43/100 110/110 - 0s - 3ms/step - accuracy: 0.7126 - loss: 1.1313 - val_accuracy: 0.7183 - val_loss: 0.5545 Epoch 44/100 110/110 - 1s - 5ms/step - accuracy: 0.7090 - loss: 1.1302 - val_accuracy: 0.7278 - val_loss: 0.5491 Epoch 45/100 110/110 - 0s - 3ms/step - accuracy: 0.7134 - loss: 1.1294 - val_accuracy: 0.7272 - val_loss: 0.5484 Epoch 46/100 110/110 - 0s - 3ms/step - accuracy: 0.7140 - loss: 1.1284 - val_accuracy: 0.7250 - val_loss: 0.5489 Epoch 47/100 110/110 - 1s - 6ms/step - accuracy: 0.7143 - loss: 1.1275 - val_accuracy: 0.7217 - val_loss: 0.5502 Epoch 48/100 110/110 - 1s - 5ms/step - accuracy: 0.7131 - loss: 1.1265 - val_accuracy: 0.7211 - val_loss: 0.5505 Epoch 49/100 110/110 - 0s - 4ms/step - accuracy: 0.7113 - loss: 1.1256 - val_accuracy: 0.7244 - val_loss: 0.5478 Epoch 50/100 110/110 - 1s - 6ms/step - accuracy: 0.7159 - loss: 1.1248 - val_accuracy: 0.7189 - val_loss: 0.5513 Epoch 51/100 110/110 - 1s - 7ms/step - accuracy: 0.7137 - loss: 1.1239 - val_accuracy: 0.7217 - val_loss: 0.5490 Epoch 52/100 110/110 - 1s - 6ms/step - accuracy: 0.7171 - loss: 1.1229 - val_accuracy: 0.7167 - val_loss: 0.5518 Epoch 53/100 110/110 - 1s - 5ms/step - accuracy: 0.7130 - loss: 1.1221 - val_accuracy: 0.7194 - val_loss: 0.5492 Epoch 54/100 110/110 - 1s - 6ms/step - accuracy: 0.7136 - loss: 1.1212 - val_accuracy: 0.7206 - val_loss: 0.5484 Epoch 55/100 110/110 - 1s - 5ms/step - accuracy: 0.7141 - loss: 1.1204 - val_accuracy: 0.7189 - val_loss: 0.5488 Epoch 56/100 110/110 - 0s - 4ms/step - accuracy: 0.7106 - loss: 1.1195 - val_accuracy: 0.7217 - val_loss: 0.5464 Epoch 57/100 110/110 - 0s - 3ms/step - accuracy: 0.7129 - loss: 1.1187 - val_accuracy: 0.7211 - val_loss: 0.5461 Epoch 58/100 110/110 - 1s - 5ms/step - accuracy: 0.7119 - loss: 1.1180 - val_accuracy: 0.7222 - val_loss: 0.5451 Epoch 59/100 110/110 - 1s - 6ms/step - accuracy: 0.7139 - loss: 1.1171 - val_accuracy: 0.7172 - val_loss: 0.5481 Epoch 60/100 110/110 - 0s - 3ms/step - accuracy: 0.7126 - loss: 1.1163 - val_accuracy: 0.7167 - val_loss: 0.5476 Epoch 61/100 110/110 - 0s - 3ms/step - accuracy: 0.7100 - loss: 1.1155 - val_accuracy: 0.7217 - val_loss: 0.5435 Epoch 62/100 110/110 - 1s - 6ms/step - accuracy: 0.7166 - loss: 1.1146 - val_accuracy: 0.7133 - val_loss: 0.5490 Epoch 63/100 110/110 - 1s - 5ms/step - accuracy: 0.7084 - loss: 1.1139 - val_accuracy: 0.7200 - val_loss: 0.5438 Epoch 64/100 110/110 - 1s - 6ms/step - accuracy: 0.7136 - loss: 1.1131 - val_accuracy: 0.7150 - val_loss: 0.5468 Epoch 65/100 110/110 - 0s - 3ms/step - accuracy: 0.7136 - loss: 1.1123 - val_accuracy: 0.7111 - val_loss: 0.5493 Epoch 66/100 110/110 - 0s - 3ms/step - accuracy: 0.7109 - loss: 1.1116 - val_accuracy: 0.7128 - val_loss: 0.5480 Epoch 67/100 110/110 - 0s - 3ms/step - accuracy: 0.7094 - loss: 1.1108 - val_accuracy: 0.7156 - val_loss: 0.5455 Epoch 68/100 110/110 - 1s - 5ms/step - accuracy: 0.7101 - loss: 1.1101 - val_accuracy: 0.7156 - val_loss: 0.5449 Epoch 69/100 110/110 - 1s - 6ms/step - accuracy: 0.7123 - loss: 1.1093 - val_accuracy: 0.7156 - val_loss: 0.5447 Epoch 70/100 110/110 - 1s - 5ms/step - accuracy: 0.7119 - loss: 1.1085 - val_accuracy: 0.7150 - val_loss: 0.5449 Epoch 71/100 110/110 - 0s - 4ms/step - accuracy: 0.7094 - loss: 1.1077 - val_accuracy: 0.7178 - val_loss: 0.5415 Epoch 72/100 110/110 - 1s - 5ms/step - accuracy: 0.7101 - loss: 1.1069 - val_accuracy: 0.7172 - val_loss: 0.5422 Epoch 73/100 110/110 - 0s - 3ms/step - accuracy: 0.7127 - loss: 1.1062 - val_accuracy: 0.7133 - val_loss: 0.5464 Epoch 74/100 110/110 - 0s - 3ms/step - accuracy: 0.7103 - loss: 1.1054 - val_accuracy: 0.7128 - val_loss: 0.5466 Epoch 75/100 110/110 - 0s - 3ms/step - accuracy: 0.7096 - loss: 1.1046 - val_accuracy: 0.7139 - val_loss: 0.5438 Epoch 76/100 110/110 - 0s - 3ms/step - accuracy: 0.7086 - loss: 1.1039 - val_accuracy: 0.7167 - val_loss: 0.5409 Epoch 77/100 110/110 - 1s - 7ms/step - accuracy: 0.7084 - loss: 1.1030 - val_accuracy: 0.7211 - val_loss: 0.5365 Epoch 78/100 110/110 - 1s - 5ms/step - accuracy: 0.7089 - loss: 1.1023 - val_accuracy: 0.7211 - val_loss: 0.5350 Epoch 79/100 110/110 - 1s - 6ms/step - accuracy: 0.7111 - loss: 1.1017 - val_accuracy: 0.7222 - val_loss: 0.5360 Epoch 80/100 110/110 - 1s - 5ms/step - accuracy: 0.7140 - loss: 1.1009 - val_accuracy: 0.7133 - val_loss: 0.5424 Epoch 81/100 110/110 - 1s - 6ms/step - accuracy: 0.7116 - loss: 1.1001 - val_accuracy: 0.7128 - val_loss: 0.5457 Epoch 82/100 110/110 - 0s - 4ms/step - accuracy: 0.7087 - loss: 1.0994 - val_accuracy: 0.7139 - val_loss: 0.5430 Epoch 83/100 110/110 - 0s - 3ms/step - accuracy: 0.7097 - loss: 1.0986 - val_accuracy: 0.7172 - val_loss: 0.5391 Epoch 84/100 110/110 - 1s - 6ms/step - accuracy: 0.7106 - loss: 1.0978 - val_accuracy: 0.7194 - val_loss: 0.5355 Epoch 85/100 110/110 - 1s - 6ms/step - accuracy: 0.7156 - loss: 1.0971 - val_accuracy: 0.7150 - val_loss: 0.5415 Epoch 86/100 110/110 - 1s - 6ms/step - accuracy: 0.7101 - loss: 1.0963 - val_accuracy: 0.7183 - val_loss: 0.5367 Epoch 87/100 110/110 - 1s - 6ms/step - accuracy: 0.7133 - loss: 1.0956 - val_accuracy: 0.7156 - val_loss: 0.5391 Epoch 88/100 110/110 - 0s - 3ms/step - accuracy: 0.7100 - loss: 1.0947 - val_accuracy: 0.7183 - val_loss: 0.5359 Epoch 89/100 110/110 - 1s - 6ms/step - accuracy: 0.7124 - loss: 1.0940 - val_accuracy: 0.7183 - val_loss: 0.5344 Epoch 90/100 110/110 - 0s - 3ms/step - accuracy: 0.7146 - loss: 1.0934 - val_accuracy: 0.7139 - val_loss: 0.5384 Epoch 91/100 110/110 - 0s - 3ms/step - accuracy: 0.7153 - loss: 1.0924 - val_accuracy: 0.7117 - val_loss: 0.5426 Epoch 92/100 110/110 - 0s - 3ms/step - accuracy: 0.7116 - loss: 1.0919 - val_accuracy: 0.7167 - val_loss: 0.5364 Epoch 93/100 110/110 - 0s - 3ms/step - accuracy: 0.7114 - loss: 1.0910 - val_accuracy: 0.7222 - val_loss: 0.5305 Epoch 94/100 110/110 - 1s - 6ms/step - accuracy: 0.7161 - loss: 1.0904 - val_accuracy: 0.7167 - val_loss: 0.5347 Epoch 95/100 110/110 - 1s - 6ms/step - accuracy: 0.7156 - loss: 1.0895 - val_accuracy: 0.7156 - val_loss: 0.5387 Epoch 96/100 110/110 - 0s - 4ms/step - accuracy: 0.7101 - loss: 1.0886 - val_accuracy: 0.7189 - val_loss: 0.5323 Epoch 97/100 110/110 - 1s - 5ms/step - accuracy: 0.7161 - loss: 1.0880 - val_accuracy: 0.7200 - val_loss: 0.5310 Epoch 98/100 110/110 - 1s - 5ms/step - accuracy: 0.7194 - loss: 1.0871 - val_accuracy: 0.7156 - val_loss: 0.5384 Epoch 99/100 110/110 - 0s - 3ms/step - accuracy: 0.7157 - loss: 1.0865 - val_accuracy: 0.7161 - val_loss: 0.5355 Epoch 100/100 110/110 - 1s - 6ms/step - accuracy: 0.7144 - loss: 1.0858 - val_accuracy: 0.7217 - val_loss: 0.5309
#plot accuracy curve
plot(history9,'accuracy')
#observe Loss curve
plot(history9,'loss')
# comparing perfomance with other models
results.loc[9] = [2,[64,32],['relu','tanh'],100,64,"sgd",[0.001,0.4],"xavier","-",history9.history["loss"][-1],history9.history["val_loss"][-1],history9.history["accuracy"][-1],history9.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
#Observing model performance for training data
model_performance_classification(model9, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.720714 | 0.720714 | 0.80751 | 0.745386 |
#Observing model perfomance with validation data
model_performance_classification(model9, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.721667 | 0.721667 | 0.80694 | 0.745043 |
Observations
- Performance is not good as model 7.
- Accuracy reduced and loss increased for both training and validation.
Model 10
Let's design model with change in momentum and chage in batch size and neuron numbers
- 1 input, 2 hidden(64,32), 1 output layers
- relu and tanh activations for the first and second hidden layer respectively
- Stochastic Gradient Descent (SGD)
- learning rate=0.001,momemtum =0.9
- batch size=32
- Add dropout of 0.2
#clear keras session
tf.keras.backend.clear_session()
#define model parameters
model10 = Sequential()
model10.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model10.add(Dropout(0.2))
model10.add(Dense(32,activation="tanh"))
model10.add(Dense(1,activation="sigmoid"))
#model summary
model10.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dropout (Dropout) │ (None, 64) │ 0 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
# compile model
optimizer = tf.keras.optimizers.SGD(learning_rate=0.001, momentum=0.9) # defining SGD as the optimizer to be used
model10.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#start training
start = time.time()
history10 = model10.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=100,class_weight=cw_dict, verbose=2,batch_size = 32)
Epoch 1/100 219/219 - 2s - 7ms/step - accuracy: 0.6033 - loss: 1.3192 - val_accuracy: 0.6672 - val_loss: 0.6256 Epoch 2/100 219/219 - 1s - 3ms/step - accuracy: 0.6794 - loss: 1.2207 - val_accuracy: 0.7161 - val_loss: 0.5776 Epoch 3/100 219/219 - 1s - 3ms/step - accuracy: 0.7007 - loss: 1.1931 - val_accuracy: 0.7006 - val_loss: 0.5933 Epoch 4/100 219/219 - 1s - 3ms/step - accuracy: 0.7074 - loss: 1.1708 - val_accuracy: 0.7139 - val_loss: 0.5785 Epoch 5/100 219/219 - 1s - 3ms/step - accuracy: 0.7109 - loss: 1.1602 - val_accuracy: 0.6928 - val_loss: 0.5926 Epoch 6/100 219/219 - 1s - 2ms/step - accuracy: 0.7011 - loss: 1.1499 - val_accuracy: 0.7322 - val_loss: 0.5484 Epoch 7/100 219/219 - 1s - 3ms/step - accuracy: 0.7061 - loss: 1.1459 - val_accuracy: 0.7378 - val_loss: 0.5394 Epoch 8/100 219/219 - 1s - 3ms/step - accuracy: 0.7056 - loss: 1.1382 - val_accuracy: 0.7417 - val_loss: 0.5314 Epoch 9/100 219/219 - 1s - 3ms/step - accuracy: 0.7126 - loss: 1.1230 - val_accuracy: 0.7478 - val_loss: 0.5189 Epoch 10/100 219/219 - 1s - 3ms/step - accuracy: 0.7164 - loss: 1.1185 - val_accuracy: 0.7356 - val_loss: 0.5267 Epoch 11/100 219/219 - 1s - 3ms/step - accuracy: 0.7223 - loss: 1.1037 - val_accuracy: 0.7317 - val_loss: 0.5275 Epoch 12/100 219/219 - 1s - 3ms/step - accuracy: 0.7221 - loss: 1.1065 - val_accuracy: 0.7222 - val_loss: 0.5416 Epoch 13/100 219/219 - 1s - 5ms/step - accuracy: 0.7227 - loss: 1.0954 - val_accuracy: 0.7344 - val_loss: 0.5270 Epoch 14/100 219/219 - 1s - 5ms/step - accuracy: 0.7214 - loss: 1.0918 - val_accuracy: 0.7450 - val_loss: 0.5064 Epoch 15/100 219/219 - 1s - 6ms/step - accuracy: 0.7307 - loss: 1.0816 - val_accuracy: 0.7356 - val_loss: 0.5208 Epoch 16/100 219/219 - 1s - 5ms/step - accuracy: 0.7283 - loss: 1.0783 - val_accuracy: 0.7400 - val_loss: 0.5143 Epoch 17/100 219/219 - 1s - 3ms/step - accuracy: 0.7330 - loss: 1.0708 - val_accuracy: 0.7489 - val_loss: 0.5038 Epoch 18/100 219/219 - 1s - 3ms/step - accuracy: 0.7366 - loss: 1.0637 - val_accuracy: 0.7478 - val_loss: 0.5066 Epoch 19/100 219/219 - 1s - 3ms/step - accuracy: 0.7409 - loss: 1.0538 - val_accuracy: 0.7433 - val_loss: 0.5083 Epoch 20/100 219/219 - 1s - 2ms/step - accuracy: 0.7443 - loss: 1.0442 - val_accuracy: 0.7383 - val_loss: 0.5154 Epoch 21/100 219/219 - 1s - 3ms/step - accuracy: 0.7409 - loss: 1.0473 - val_accuracy: 0.7500 - val_loss: 0.5067 Epoch 22/100 219/219 - 1s - 3ms/step - accuracy: 0.7389 - loss: 1.0471 - val_accuracy: 0.7622 - val_loss: 0.4872 Epoch 23/100 219/219 - 1s - 3ms/step - accuracy: 0.7503 - loss: 1.0242 - val_accuracy: 0.7306 - val_loss: 0.5335 Epoch 24/100 219/219 - 1s - 3ms/step - accuracy: 0.7463 - loss: 1.0288 - val_accuracy: 0.7533 - val_loss: 0.5024 Epoch 25/100 219/219 - 1s - 3ms/step - accuracy: 0.7504 - loss: 1.0143 - val_accuracy: 0.7728 - val_loss: 0.4807 Epoch 26/100 219/219 - 1s - 3ms/step - accuracy: 0.7574 - loss: 1.0138 - val_accuracy: 0.7656 - val_loss: 0.4897 Epoch 27/100 219/219 - 1s - 3ms/step - accuracy: 0.7573 - loss: 1.0131 - val_accuracy: 0.7639 - val_loss: 0.4915 Epoch 28/100 219/219 - 1s - 2ms/step - accuracy: 0.7647 - loss: 1.0011 - val_accuracy: 0.7444 - val_loss: 0.5151 Epoch 29/100 219/219 - 1s - 3ms/step - accuracy: 0.7571 - loss: 1.0053 - val_accuracy: 0.7733 - val_loss: 0.4769 Epoch 30/100 219/219 - 1s - 3ms/step - accuracy: 0.7636 - loss: 0.9985 - val_accuracy: 0.7617 - val_loss: 0.4884 Epoch 31/100 219/219 - 1s - 2ms/step - accuracy: 0.7661 - loss: 0.9918 - val_accuracy: 0.7639 - val_loss: 0.4892 Epoch 32/100 219/219 - 1s - 5ms/step - accuracy: 0.7657 - loss: 0.9945 - val_accuracy: 0.7472 - val_loss: 0.5108 Epoch 33/100 219/219 - 1s - 5ms/step - accuracy: 0.7681 - loss: 0.9882 - val_accuracy: 0.7522 - val_loss: 0.4971 Epoch 34/100 219/219 - 1s - 6ms/step - accuracy: 0.7684 - loss: 0.9805 - val_accuracy: 0.7622 - val_loss: 0.4823 Epoch 35/100 219/219 - 1s - 3ms/step - accuracy: 0.7724 - loss: 0.9757 - val_accuracy: 0.7583 - val_loss: 0.4817 Epoch 36/100 219/219 - 1s - 3ms/step - accuracy: 0.7716 - loss: 0.9772 - val_accuracy: 0.7589 - val_loss: 0.4942 Epoch 37/100 219/219 - 1s - 3ms/step - accuracy: 0.7780 - loss: 0.9631 - val_accuracy: 0.7556 - val_loss: 0.4930 Epoch 38/100 219/219 - 2s - 8ms/step - accuracy: 0.7766 - loss: 0.9654 - val_accuracy: 0.7794 - val_loss: 0.4548 Epoch 39/100 219/219 - 1s - 6ms/step - accuracy: 0.7721 - loss: 0.9663 - val_accuracy: 0.7844 - val_loss: 0.4531 Epoch 40/100 219/219 - 1s - 3ms/step - accuracy: 0.7784 - loss: 0.9680 - val_accuracy: 0.7733 - val_loss: 0.4699 Epoch 41/100 219/219 - 1s - 3ms/step - accuracy: 0.7789 - loss: 0.9570 - val_accuracy: 0.7822 - val_loss: 0.4555 Epoch 42/100 219/219 - 1s - 3ms/step - accuracy: 0.7850 - loss: 0.9497 - val_accuracy: 0.7578 - val_loss: 0.4945 Epoch 43/100 219/219 - 1s - 3ms/step - accuracy: 0.7860 - loss: 0.9468 - val_accuracy: 0.7817 - val_loss: 0.4568 Epoch 44/100 219/219 - 1s - 3ms/step - accuracy: 0.7816 - loss: 0.9478 - val_accuracy: 0.7694 - val_loss: 0.4767 Epoch 45/100 219/219 - 1s - 3ms/step - accuracy: 0.7800 - loss: 0.9456 - val_accuracy: 0.7822 - val_loss: 0.4510 Epoch 46/100 219/219 - 1s - 3ms/step - accuracy: 0.7874 - loss: 0.9470 - val_accuracy: 0.7717 - val_loss: 0.4603 Epoch 47/100 219/219 - 1s - 3ms/step - accuracy: 0.7881 - loss: 0.9442 - val_accuracy: 0.7733 - val_loss: 0.4666 Epoch 48/100 219/219 - 1s - 4ms/step - accuracy: 0.7899 - loss: 0.9418 - val_accuracy: 0.7656 - val_loss: 0.4721 Epoch 49/100 219/219 - 1s - 4ms/step - accuracy: 0.7907 - loss: 0.9304 - val_accuracy: 0.7678 - val_loss: 0.4753 Epoch 50/100 219/219 - 1s - 6ms/step - accuracy: 0.7909 - loss: 0.9344 - val_accuracy: 0.7567 - val_loss: 0.4858 Epoch 51/100 219/219 - 1s - 4ms/step - accuracy: 0.7971 - loss: 0.9168 - val_accuracy: 0.7800 - val_loss: 0.4535 Epoch 52/100 219/219 - 1s - 3ms/step - accuracy: 0.7880 - loss: 0.9256 - val_accuracy: 0.7806 - val_loss: 0.4491 Epoch 53/100 219/219 - 1s - 3ms/step - accuracy: 0.7894 - loss: 0.9280 - val_accuracy: 0.7556 - val_loss: 0.4849 Epoch 54/100 219/219 - 1s - 3ms/step - accuracy: 0.7870 - loss: 0.9348 - val_accuracy: 0.7800 - val_loss: 0.4556 Epoch 55/100 219/219 - 1s - 3ms/step - accuracy: 0.7886 - loss: 0.9266 - val_accuracy: 0.7744 - val_loss: 0.4629 Epoch 56/100 219/219 - 1s - 3ms/step - accuracy: 0.7917 - loss: 0.9270 - val_accuracy: 0.7761 - val_loss: 0.4602 Epoch 57/100 219/219 - 1s - 3ms/step - accuracy: 0.7907 - loss: 0.9162 - val_accuracy: 0.7650 - val_loss: 0.4716 Epoch 58/100 219/219 - 1s - 3ms/step - accuracy: 0.7907 - loss: 0.9173 - val_accuracy: 0.7778 - val_loss: 0.4497 Epoch 59/100 219/219 - 1s - 3ms/step - accuracy: 0.7956 - loss: 0.9157 - val_accuracy: 0.7672 - val_loss: 0.4712 Epoch 60/100 219/219 - 1s - 3ms/step - accuracy: 0.7936 - loss: 0.9232 - val_accuracy: 0.7789 - val_loss: 0.4489 Epoch 61/100 219/219 - 1s - 3ms/step - accuracy: 0.7907 - loss: 0.9255 - val_accuracy: 0.7806 - val_loss: 0.4523 Epoch 62/100 219/219 - 1s - 3ms/step - accuracy: 0.7933 - loss: 0.9161 - val_accuracy: 0.7900 - val_loss: 0.4297 Epoch 63/100 219/219 - 1s - 3ms/step - accuracy: 0.7933 - loss: 0.9196 - val_accuracy: 0.7800 - val_loss: 0.4559 Epoch 64/100 219/219 - 1s - 3ms/step - accuracy: 0.7964 - loss: 0.9046 - val_accuracy: 0.7867 - val_loss: 0.4358 Epoch 65/100 219/219 - 1s - 3ms/step - accuracy: 0.7947 - loss: 0.9046 - val_accuracy: 0.7806 - val_loss: 0.4499 Epoch 66/100 219/219 - 1s - 2ms/step - accuracy: 0.7977 - loss: 0.9116 - val_accuracy: 0.7700 - val_loss: 0.4580 Epoch 67/100 219/219 - 1s - 5ms/step - accuracy: 0.7937 - loss: 0.9215 - val_accuracy: 0.7739 - val_loss: 0.4584 Epoch 68/100 219/219 - 1s - 5ms/step - accuracy: 0.7900 - loss: 0.9145 - val_accuracy: 0.7883 - val_loss: 0.4414 Epoch 69/100 219/219 - 1s - 6ms/step - accuracy: 0.7966 - loss: 0.9137 - val_accuracy: 0.7800 - val_loss: 0.4514 Epoch 70/100 219/219 - 1s - 5ms/step - accuracy: 0.7900 - loss: 0.9191 - val_accuracy: 0.7906 - val_loss: 0.4339 Epoch 71/100 219/219 - 1s - 3ms/step - accuracy: 0.7991 - loss: 0.9092 - val_accuracy: 0.7656 - val_loss: 0.4778 Epoch 72/100 219/219 - 1s - 3ms/step - accuracy: 0.8017 - loss: 0.9006 - val_accuracy: 0.7933 - val_loss: 0.4342 Epoch 73/100 219/219 - 1s - 3ms/step - accuracy: 0.8007 - loss: 0.9071 - val_accuracy: 0.7783 - val_loss: 0.4473 Epoch 74/100 219/219 - 1s - 3ms/step - accuracy: 0.8030 - loss: 0.9103 - val_accuracy: 0.7778 - val_loss: 0.4587 Epoch 75/100 219/219 - 1s - 3ms/step - accuracy: 0.7943 - loss: 0.9082 - val_accuracy: 0.7656 - val_loss: 0.4728 Epoch 76/100 219/219 - 1s - 3ms/step - accuracy: 0.8043 - loss: 0.8992 - val_accuracy: 0.7667 - val_loss: 0.4728 Epoch 77/100 219/219 - 1s - 3ms/step - accuracy: 0.7946 - loss: 0.9055 - val_accuracy: 0.7889 - val_loss: 0.4314 Epoch 78/100 219/219 - 1s - 3ms/step - accuracy: 0.7941 - loss: 0.8997 - val_accuracy: 0.7844 - val_loss: 0.4436 Epoch 79/100 219/219 - 1s - 3ms/step - accuracy: 0.8010 - loss: 0.9071 - val_accuracy: 0.7989 - val_loss: 0.4284 Epoch 80/100 219/219 - 1s - 3ms/step - accuracy: 0.7990 - loss: 0.9046 - val_accuracy: 0.7900 - val_loss: 0.4365 Epoch 81/100 219/219 - 1s - 3ms/step - accuracy: 0.7989 - loss: 0.8970 - val_accuracy: 0.8011 - val_loss: 0.4256 Epoch 82/100 219/219 - 1s - 3ms/step - accuracy: 0.7963 - loss: 0.9103 - val_accuracy: 0.7761 - val_loss: 0.4581 Epoch 83/100 219/219 - 1s - 3ms/step - accuracy: 0.7991 - loss: 0.8914 - val_accuracy: 0.7956 - val_loss: 0.4316 Epoch 84/100 219/219 - 1s - 3ms/step - accuracy: 0.8000 - loss: 0.9059 - val_accuracy: 0.7928 - val_loss: 0.4429 Epoch 85/100 219/219 - 1s - 3ms/step - accuracy: 0.8036 - loss: 0.8985 - val_accuracy: 0.7656 - val_loss: 0.4696 Epoch 86/100 219/219 - 1s - 5ms/step - accuracy: 0.7994 - loss: 0.9073 - val_accuracy: 0.7700 - val_loss: 0.4662 Epoch 87/100 219/219 - 1s - 5ms/step - accuracy: 0.7991 - loss: 0.9033 - val_accuracy: 0.7811 - val_loss: 0.4520 Epoch 88/100 219/219 - 1s - 6ms/step - accuracy: 0.7951 - loss: 0.9060 - val_accuracy: 0.7839 - val_loss: 0.4489 Epoch 89/100 219/219 - 1s - 5ms/step - accuracy: 0.8000 - loss: 0.8986 - val_accuracy: 0.7889 - val_loss: 0.4437 Epoch 90/100 219/219 - 1s - 6ms/step - accuracy: 0.8046 - loss: 0.8904 - val_accuracy: 0.7856 - val_loss: 0.4460 Epoch 91/100 219/219 - 1s - 3ms/step - accuracy: 0.8026 - loss: 0.8976 - val_accuracy: 0.7922 - val_loss: 0.4407 Epoch 92/100 219/219 - 1s - 3ms/step - accuracy: 0.8011 - loss: 0.8981 - val_accuracy: 0.7672 - val_loss: 0.4770 Epoch 93/100 219/219 - 1s - 3ms/step - accuracy: 0.8024 - loss: 0.8880 - val_accuracy: 0.7344 - val_loss: 0.5224 Epoch 94/100 219/219 - 1s - 3ms/step - accuracy: 0.7994 - loss: 0.8920 - val_accuracy: 0.7600 - val_loss: 0.4803 Epoch 95/100 219/219 - 1s - 3ms/step - accuracy: 0.8003 - loss: 0.8965 - val_accuracy: 0.7861 - val_loss: 0.4440 Epoch 96/100 219/219 - 1s - 3ms/step - accuracy: 0.8023 - loss: 0.8932 - val_accuracy: 0.7672 - val_loss: 0.4685 Epoch 97/100 219/219 - 1s - 3ms/step - accuracy: 0.7983 - loss: 0.8965 - val_accuracy: 0.7850 - val_loss: 0.4501 Epoch 98/100 219/219 - 1s - 3ms/step - accuracy: 0.7994 - loss: 0.8835 - val_accuracy: 0.7761 - val_loss: 0.4536 Epoch 99/100 219/219 - 1s - 3ms/step - accuracy: 0.8069 - loss: 0.8896 - val_accuracy: 0.7689 - val_loss: 0.4715 Epoch 100/100 219/219 - 1s - 3ms/step - accuracy: 0.8049 - loss: 0.8929 - val_accuracy: 0.7761 - val_loss: 0.4563
#plot accuracy curve
plot(history10,'accuracy')
#plot loss
plot(history10,'loss')
#Comparing all models
results.loc[10] = [2,[64,32],['relu','tanh'],100,32,"sgd",[0.001,0.9],"xavier",["dropout:0.2"],history10.history["loss"][-1],history10.history["val_loss"][-1],history10.history["accuracy"][-1],history10.history["val_accuracy"][-1],round(end - start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
| 10 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2] | 0.892885 | 0.456302 | 0.804857 | 0.776111 | -7129.34 |
Observation
- This model,Loss is slightly reduced but also Accuracy than model 7 and 3.
Model 11
Let's design model with change in momentum and chage in batch size and neuron numbers
- 1 input, 2 hidden(64,32), 1 output layers
- relu and tanh activations for the first and second hidden layer respectively
- Stochastic Gradient Descent (SGD)
- learning rate=0.001,momemtum =0.9
- batch size=32
- Add dropout of 0.2
- add batchnormalization
#clear keras
tf.keras.backend.clear_session()
#define model with batch normalization,dropout layer
model11 = Sequential()
model11.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model11.add(BatchNormalization())
model11.add(Dropout(0.2))
model11.add(Dense(32,activation="tanh"))
model11.add(Dense(1,activation="sigmoid"))
#model summary
model11.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ batch_normalization │ (None, 64) │ 256 │ │ (BatchNormalization) │ │ │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dropout (Dropout) │ (None, 64) │ 0 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 3,137 (12.25 KB)
Trainable params: 3,009 (11.75 KB)
Non-trainable params: 128 (512.00 B)
#compile model
optimizer = tf.keras.optimizers.SGD(learning_rate=0.001, momentum=0.9) # defining SGD as the optimizer to be used
model11.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#start training
start = time.time()
history11 = model11.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=1,batch_size = 32)
Epoch 1/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 4ms/step - accuracy: 0.5766 - loss: 1.3942 - val_accuracy: 0.7456 - val_loss: 0.5436 Epoch 2/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.6954 - loss: 1.1787 - val_accuracy: 0.7311 - val_loss: 0.5320 Epoch 3/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.7006 - loss: 1.1395 - val_accuracy: 0.7211 - val_loss: 0.5370 Epoch 4/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.7140 - loss: 1.1366 - val_accuracy: 0.7200 - val_loss: 0.5373 Epoch 5/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.7108 - loss: 1.1074 - val_accuracy: 0.7311 - val_loss: 0.5196 Epoch 6/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 4ms/step - accuracy: 0.7152 - loss: 1.1208 - val_accuracy: 0.7500 - val_loss: 0.5094 Epoch 7/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7186 - loss: 1.1064 - val_accuracy: 0.7333 - val_loss: 0.5186 Epoch 8/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7387 - loss: 1.0665 - val_accuracy: 0.7233 - val_loss: 0.5297 Epoch 9/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7264 - loss: 1.0753 - val_accuracy: 0.7522 - val_loss: 0.4965 Epoch 10/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7462 - loss: 1.0246 - val_accuracy: 0.7661 - val_loss: 0.4888 Epoch 11/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7477 - loss: 1.0531 - val_accuracy: 0.7467 - val_loss: 0.5082 Epoch 12/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7505 - loss: 1.0023 - val_accuracy: 0.7506 - val_loss: 0.4932 Epoch 13/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 7ms/step - accuracy: 0.7511 - loss: 0.9912 - val_accuracy: 0.7633 - val_loss: 0.4753 Epoch 14/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 4s 14ms/step - accuracy: 0.7539 - loss: 1.0015 - val_accuracy: 0.7539 - val_loss: 0.4820 Epoch 15/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 4s 6ms/step - accuracy: 0.7609 - loss: 1.0009 - val_accuracy: 0.7644 - val_loss: 0.4751 Epoch 16/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.7672 - loss: 0.9518 - val_accuracy: 0.7522 - val_loss: 0.4821 Epoch 17/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 3s 6ms/step - accuracy: 0.7807 - loss: 0.9398 - val_accuracy: 0.7767 - val_loss: 0.4535 Epoch 18/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 6ms/step - accuracy: 0.7689 - loss: 0.9592 - val_accuracy: 0.7633 - val_loss: 0.4645 Epoch 19/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 6ms/step - accuracy: 0.7733 - loss: 0.9297 - val_accuracy: 0.7778 - val_loss: 0.4509 Epoch 20/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 3s 9ms/step - accuracy: 0.7834 - loss: 0.9129 - val_accuracy: 0.7683 - val_loss: 0.4563 Epoch 21/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 9ms/step - accuracy: 0.7760 - loss: 0.9349 - val_accuracy: 0.7489 - val_loss: 0.4887 Epoch 22/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 6ms/step - accuracy: 0.7817 - loss: 0.9292 - val_accuracy: 0.7728 - val_loss: 0.4521 Epoch 23/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 5ms/step - accuracy: 0.7811 - loss: 0.9309 - val_accuracy: 0.7717 - val_loss: 0.4535 Epoch 24/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.7866 - loss: 0.8995 - val_accuracy: 0.7711 - val_loss: 0.4532 Epoch 25/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 6ms/step - accuracy: 0.7788 - loss: 0.9054 - val_accuracy: 0.7794 - val_loss: 0.4489 Epoch 26/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 3s 6ms/step - accuracy: 0.7757 - loss: 0.9299 - val_accuracy: 0.7806 - val_loss: 0.4475 Epoch 27/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 3s 9ms/step - accuracy: 0.7745 - loss: 0.9306 - val_accuracy: 0.7778 - val_loss: 0.4434 Epoch 28/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.7801 - loss: 0.9168 - val_accuracy: 0.7811 - val_loss: 0.4403 Epoch 29/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7860 - loss: 0.8947 - val_accuracy: 0.7822 - val_loss: 0.4493 Epoch 30/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7950 - loss: 0.8816 - val_accuracy: 0.7706 - val_loss: 0.4544 Epoch 31/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7777 - loss: 0.9551 - val_accuracy: 0.7928 - val_loss: 0.4320 Epoch 32/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7911 - loss: 0.9023 - val_accuracy: 0.7833 - val_loss: 0.4367 Epoch 33/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7883 - loss: 0.8976 - val_accuracy: 0.7783 - val_loss: 0.4406 Epoch 34/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7890 - loss: 0.9118 - val_accuracy: 0.7639 - val_loss: 0.4545 Epoch 35/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7888 - loss: 0.9268 - val_accuracy: 0.7694 - val_loss: 0.4563 Epoch 36/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7839 - loss: 0.9067 - val_accuracy: 0.7739 - val_loss: 0.4534 Epoch 37/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7946 - loss: 0.8909 - val_accuracy: 0.7778 - val_loss: 0.4456 Epoch 38/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.7997 - loss: 0.8872 - val_accuracy: 0.7872 - val_loss: 0.4406 Epoch 39/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.7980 - loss: 0.9021 - val_accuracy: 0.7828 - val_loss: 0.4424 Epoch 40/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.7925 - loss: 0.8967 - val_accuracy: 0.7839 - val_loss: 0.4407 Epoch 41/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 4ms/step - accuracy: 0.8009 - loss: 0.8633 - val_accuracy: 0.7861 - val_loss: 0.4359 Epoch 42/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7998 - loss: 0.8893 - val_accuracy: 0.7794 - val_loss: 0.4497 Epoch 43/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7846 - loss: 0.9065 - val_accuracy: 0.7811 - val_loss: 0.4473 Epoch 44/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7988 - loss: 0.8927 - val_accuracy: 0.7867 - val_loss: 0.4488 Epoch 45/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7968 - loss: 0.8897 - val_accuracy: 0.7900 - val_loss: 0.4425 Epoch 46/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7987 - loss: 0.8605 - val_accuracy: 0.7900 - val_loss: 0.4373 Epoch 47/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 4ms/step - accuracy: 0.7969 - loss: 0.8792 - val_accuracy: 0.7861 - val_loss: 0.4465 Epoch 48/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 7ms/step - accuracy: 0.7994 - loss: 0.8641 - val_accuracy: 0.7772 - val_loss: 0.4605 Epoch 49/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 3ms/step - accuracy: 0.7803 - loss: 0.9022 - val_accuracy: 0.7928 - val_loss: 0.4356 Epoch 50/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 7ms/step - accuracy: 0.7880 - loss: 0.8979 - val_accuracy: 0.7861 - val_loss: 0.4417 Epoch 51/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 3s 9ms/step - accuracy: 0.7918 - loss: 0.8878 - val_accuracy: 0.7828 - val_loss: 0.4436 Epoch 52/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 6ms/step - accuracy: 0.7934 - loss: 0.8870 - val_accuracy: 0.7911 - val_loss: 0.4453 Epoch 53/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 4s 11ms/step - accuracy: 0.7952 - loss: 0.9106 - val_accuracy: 0.7917 - val_loss: 0.4403 Epoch 54/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 8ms/step - accuracy: 0.8020 - loss: 0.8545 - val_accuracy: 0.7900 - val_loss: 0.4375 Epoch 55/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 6ms/step - accuracy: 0.8092 - loss: 0.8541 - val_accuracy: 0.7872 - val_loss: 0.4454 Epoch 56/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 4s 11ms/step - accuracy: 0.7943 - loss: 0.8682 - val_accuracy: 0.7856 - val_loss: 0.4502 Epoch 57/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 10ms/step - accuracy: 0.7889 - loss: 0.8922 - val_accuracy: 0.7900 - val_loss: 0.4446 Epoch 58/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 6ms/step - accuracy: 0.7933 - loss: 0.8880 - val_accuracy: 0.7933 - val_loss: 0.4381 Epoch 59/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 6ms/step - accuracy: 0.8028 - loss: 0.8541 - val_accuracy: 0.7872 - val_loss: 0.4486 Epoch 60/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 6ms/step - accuracy: 0.8134 - loss: 0.8492 - val_accuracy: 0.7889 - val_loss: 0.4434 Epoch 61/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 6ms/step - accuracy: 0.8059 - loss: 0.8445 - val_accuracy: 0.7983 - val_loss: 0.4324 Epoch 62/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 3ms/step - accuracy: 0.7960 - loss: 0.8574 - val_accuracy: 0.8006 - val_loss: 0.4368 Epoch 63/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 4ms/step - accuracy: 0.8176 - loss: 0.8689 - val_accuracy: 0.7850 - val_loss: 0.4476 Epoch 64/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 4ms/step - accuracy: 0.7919 - loss: 0.8803 - val_accuracy: 0.7989 - val_loss: 0.4416 Epoch 65/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 6ms/step - accuracy: 0.8045 - loss: 0.8685 - val_accuracy: 0.8006 - val_loss: 0.4324 Epoch 66/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7970 - loss: 0.8689 - val_accuracy: 0.7994 - val_loss: 0.4310 Epoch 67/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7947 - loss: 0.8812 - val_accuracy: 0.7978 - val_loss: 0.4405 Epoch 68/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8009 - loss: 0.8847 - val_accuracy: 0.7967 - val_loss: 0.4369 Epoch 69/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8042 - loss: 0.8677 - val_accuracy: 0.7967 - val_loss: 0.4324 Epoch 70/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8023 - loss: 0.8867 - val_accuracy: 0.7983 - val_loss: 0.4348 Epoch 71/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8093 - loss: 0.8522 - val_accuracy: 0.7978 - val_loss: 0.4422 Epoch 72/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8030 - loss: 0.8482 - val_accuracy: 0.7906 - val_loss: 0.4480 Epoch 73/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8043 - loss: 0.8413 - val_accuracy: 0.7906 - val_loss: 0.4430 Epoch 74/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8005 - loss: 0.8931 - val_accuracy: 0.8017 - val_loss: 0.4368 Epoch 75/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8006 - loss: 0.8622 - val_accuracy: 0.7933 - val_loss: 0.4370 Epoch 76/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 4ms/step - accuracy: 0.8076 - loss: 0.8297 - val_accuracy: 0.7844 - val_loss: 0.4511 Epoch 77/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.7893 - loss: 0.9066 - val_accuracy: 0.7928 - val_loss: 0.4365 Epoch 78/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 4ms/step - accuracy: 0.8097 - loss: 0.8565 - val_accuracy: 0.7811 - val_loss: 0.4582 Epoch 79/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8026 - loss: 0.8570 - val_accuracy: 0.7922 - val_loss: 0.4400 Epoch 80/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8093 - loss: 0.8647 - val_accuracy: 0.7811 - val_loss: 0.4499 Epoch 81/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7949 - loss: 0.8694 - val_accuracy: 0.7933 - val_loss: 0.4351 Epoch 82/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8108 - loss: 0.8266 - val_accuracy: 0.7872 - val_loss: 0.4550 Epoch 83/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8103 - loss: 0.8397 - val_accuracy: 0.7817 - val_loss: 0.4556 Epoch 84/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8059 - loss: 0.8378 - val_accuracy: 0.7833 - val_loss: 0.4499 Epoch 85/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8062 - loss: 0.8470 - val_accuracy: 0.7917 - val_loss: 0.4438 Epoch 86/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8082 - loss: 0.8665 - val_accuracy: 0.7833 - val_loss: 0.4560 Epoch 87/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7943 - loss: 0.8869 - val_accuracy: 0.8017 - val_loss: 0.4320 Epoch 88/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.8183 - loss: 0.8421 - val_accuracy: 0.7933 - val_loss: 0.4506 Epoch 89/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.8007 - loss: 0.8798 - val_accuracy: 0.8011 - val_loss: 0.4376 Epoch 90/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.8142 - loss: 0.8464 - val_accuracy: 0.8028 - val_loss: 0.4357 Epoch 91/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8093 - loss: 0.8200 - val_accuracy: 0.7933 - val_loss: 0.4495 Epoch 92/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8043 - loss: 0.8312 - val_accuracy: 0.7789 - val_loss: 0.4610 Epoch 93/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7966 - loss: 0.8353 - val_accuracy: 0.7967 - val_loss: 0.4448 Epoch 94/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8082 - loss: 0.8235 - val_accuracy: 0.7922 - val_loss: 0.4497 Epoch 95/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8056 - loss: 0.8282 - val_accuracy: 0.7894 - val_loss: 0.4426 Epoch 96/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8031 - loss: 0.8687 - val_accuracy: 0.7844 - val_loss: 0.4494 Epoch 97/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8066 - loss: 0.8417 - val_accuracy: 0.7983 - val_loss: 0.4400 Epoch 98/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8043 - loss: 0.8518 - val_accuracy: 0.7944 - val_loss: 0.4502 Epoch 99/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8115 - loss: 0.8378 - val_accuracy: 0.7906 - val_loss: 0.4488 Epoch 100/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8164 - loss: 0.8208 - val_accuracy: 0.7894 - val_loss: 0.4550 Epoch 101/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 5ms/step - accuracy: 0.8054 - loss: 0.8502 - val_accuracy: 0.7972 - val_loss: 0.4416 Epoch 102/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.8114 - loss: 0.8298 - val_accuracy: 0.7983 - val_loss: 0.4449 Epoch 103/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 4ms/step - accuracy: 0.8099 - loss: 0.8643 - val_accuracy: 0.7994 - val_loss: 0.4379 Epoch 104/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8085 - loss: 0.8363 - val_accuracy: 0.7778 - val_loss: 0.4566 Epoch 105/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8030 - loss: 0.8353 - val_accuracy: 0.7900 - val_loss: 0.4478 Epoch 106/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8187 - loss: 0.8158 - val_accuracy: 0.7917 - val_loss: 0.4502 Epoch 107/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8183 - loss: 0.7933 - val_accuracy: 0.7822 - val_loss: 0.4612 Epoch 108/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8113 - loss: 0.8233 - val_accuracy: 0.7806 - val_loss: 0.4691 Epoch 109/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8168 - loss: 0.8194 - val_accuracy: 0.7967 - val_loss: 0.4330 Epoch 110/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8168 - loss: 0.8008 - val_accuracy: 0.7767 - val_loss: 0.4612 Epoch 111/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.7975 - loss: 0.8337 - val_accuracy: 0.7939 - val_loss: 0.4471 Epoch 112/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 4ms/step - accuracy: 0.8074 - loss: 0.8217 - val_accuracy: 0.7939 - val_loss: 0.4467 Epoch 113/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.8152 - loss: 0.8343 - val_accuracy: 0.7822 - val_loss: 0.4615 Epoch 114/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.8173 - loss: 0.7991 - val_accuracy: 0.7883 - val_loss: 0.4436 Epoch 115/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 4ms/step - accuracy: 0.8251 - loss: 0.7770 - val_accuracy: 0.7744 - val_loss: 0.4700 Epoch 116/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8053 - loss: 0.8196 - val_accuracy: 0.7900 - val_loss: 0.4480 Epoch 117/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8037 - loss: 0.8325 - val_accuracy: 0.7911 - val_loss: 0.4474 Epoch 118/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8171 - loss: 0.7765 - val_accuracy: 0.7822 - val_loss: 0.4582 Epoch 119/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8148 - loss: 0.8022 - val_accuracy: 0.7833 - val_loss: 0.4537 Epoch 120/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8055 - loss: 0.8251 - val_accuracy: 0.7878 - val_loss: 0.4485 Epoch 121/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8099 - loss: 0.8167 - val_accuracy: 0.7867 - val_loss: 0.4529 Epoch 122/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8075 - loss: 0.8147 - val_accuracy: 0.7867 - val_loss: 0.4441 Epoch 123/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8151 - loss: 0.8389 - val_accuracy: 0.7878 - val_loss: 0.4502 Epoch 124/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8285 - loss: 0.7956 - val_accuracy: 0.7894 - val_loss: 0.4491 Epoch 125/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 5ms/step - accuracy: 0.8016 - loss: 0.8470 - val_accuracy: 0.7900 - val_loss: 0.4478 Epoch 126/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 5ms/step - accuracy: 0.8088 - loss: 0.8199 - val_accuracy: 0.7844 - val_loss: 0.4519 Epoch 127/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 4ms/step - accuracy: 0.8122 - loss: 0.8177 - val_accuracy: 0.7878 - val_loss: 0.4484 Epoch 128/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8189 - loss: 0.7953 - val_accuracy: 0.7861 - val_loss: 0.4427 Epoch 129/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8127 - loss: 0.8118 - val_accuracy: 0.7817 - val_loss: 0.4555 Epoch 130/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8164 - loss: 0.7931 - val_accuracy: 0.7828 - val_loss: 0.4561 Epoch 131/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8159 - loss: 0.8291 - val_accuracy: 0.7944 - val_loss: 0.4393 Epoch 132/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8200 - loss: 0.8125 - val_accuracy: 0.7911 - val_loss: 0.4497 Epoch 133/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8142 - loss: 0.8311 - val_accuracy: 0.7944 - val_loss: 0.4396 Epoch 134/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8121 - loss: 0.8051 - val_accuracy: 0.7956 - val_loss: 0.4439 Epoch 135/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8090 - loss: 0.8189 - val_accuracy: 0.7878 - val_loss: 0.4530 Epoch 136/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8155 - loss: 0.8123 - val_accuracy: 0.7967 - val_loss: 0.4469 Epoch 137/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 4ms/step - accuracy: 0.8145 - loss: 0.8134 - val_accuracy: 0.7889 - val_loss: 0.4538 Epoch 138/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 3s 9ms/step - accuracy: 0.8057 - loss: 0.8355 - val_accuracy: 0.7933 - val_loss: 0.4454 Epoch 139/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 2s 5ms/step - accuracy: 0.8165 - loss: 0.7860 - val_accuracy: 0.7950 - val_loss: 0.4527 Epoch 140/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8135 - loss: 0.8073 - val_accuracy: 0.8000 - val_loss: 0.4361 Epoch 141/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8106 - loss: 0.8265 - val_accuracy: 0.7928 - val_loss: 0.4462 Epoch 142/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8170 - loss: 0.8056 - val_accuracy: 0.7872 - val_loss: 0.4496 Epoch 143/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8121 - loss: 0.7955 - val_accuracy: 0.7839 - val_loss: 0.4563 Epoch 144/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8191 - loss: 0.8018 - val_accuracy: 0.7911 - val_loss: 0.4467 Epoch 145/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8170 - loss: 0.8026 - val_accuracy: 0.7894 - val_loss: 0.4466 Epoch 146/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8051 - loss: 0.8109 - val_accuracy: 0.7972 - val_loss: 0.4550 Epoch 147/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8186 - loss: 0.8042 - val_accuracy: 0.7911 - val_loss: 0.4544 Epoch 148/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8164 - loss: 0.8063 - val_accuracy: 0.7883 - val_loss: 0.4537 Epoch 149/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8117 - loss: 0.8137 - val_accuracy: 0.7856 - val_loss: 0.4557 Epoch 150/150 219/219 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.8092 - loss: 0.8060 - val_accuracy: 0.7911 - val_loss: 0.4469
#plot accuracy curve
plot(history11,'accuracy')
#plot loss curve
plot(history11,'loss')
#model comparison
results.loc[11] = [2,[64,32],['relu','tanh'],100,32,"sgd",[0.001,0.9],"xavier",["dropout:0.2","BatchNormalization"],history11.history["loss"][-1],history11.history["val_loss"][-1],history11.history["accuracy"][-1],history11.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
| 10 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2] | 0.892885 | 0.456302 | 0.804857 | 0.776111 | -7129.34 |
| 11 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2, BatchNormalization] | 0.808190 | 0.446884 | 0.810143 | 0.791111 | -7129.34 |
#model performance for training data
model_performance_classification(model11, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.848 | 0.848 | 0.884197 | 0.857765 |
#Model performance fro validation data
model_performance_classification(model11, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.791111 | 0.791111 | 0.829434 | 0.803487 |
Observations
- With this model,we will get more fluctuations to the response.No improvement in performance.
Model Performance Improvement¶
Neural Network with Adam Optimizer¶
Model 12
- With Adam optimizer model 7.
- 1 input, 2 hidden(64,32), 1 output layers
- relu and tanh activations for the first and second hidden layer respectively
- batch size=32
#Clear keras session
tf.keras.backend.clear_session()
#Define model
model12 = Sequential()
model12.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model12.add(Dense(32,activation="tanh"))
model12.add(Dense(1,activation="sigmoid"))
#Model summary
model12.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#Compile the model
optimizer = tf.keras.optimizers.Adam() # defining SGD as the optimizer to be used
model12.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#start training
start = time.time()
history12 = model12.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 32)
Epoch 1/150 219/219 - 3s - 13ms/step - accuracy: 0.7003 - loss: 1.1876 - val_accuracy: 0.6472 - val_loss: 0.6385 Epoch 2/150 219/219 - 1s - 4ms/step - accuracy: 0.7121 - loss: 1.1250 - val_accuracy: 0.7078 - val_loss: 0.5495 Epoch 3/150 219/219 - 1s - 3ms/step - accuracy: 0.7176 - loss: 1.0937 - val_accuracy: 0.7294 - val_loss: 0.5232 Epoch 4/150 219/219 - 1s - 3ms/step - accuracy: 0.7250 - loss: 1.0662 - val_accuracy: 0.7906 - val_loss: 0.4608 Epoch 5/150 219/219 - 1s - 3ms/step - accuracy: 0.7483 - loss: 1.0445 - val_accuracy: 0.7783 - val_loss: 0.4753 Epoch 6/150 219/219 - 1s - 3ms/step - accuracy: 0.7627 - loss: 1.0219 - val_accuracy: 0.7411 - val_loss: 0.5249 Epoch 7/150 219/219 - 1s - 3ms/step - accuracy: 0.7679 - loss: 0.9969 - val_accuracy: 0.7289 - val_loss: 0.5355 Epoch 8/150 219/219 - 1s - 3ms/step - accuracy: 0.7744 - loss: 0.9800 - val_accuracy: 0.7644 - val_loss: 0.5012 Epoch 9/150 219/219 - 1s - 3ms/step - accuracy: 0.7774 - loss: 0.9661 - val_accuracy: 0.7572 - val_loss: 0.5140 Epoch 10/150 219/219 - 1s - 3ms/step - accuracy: 0.7810 - loss: 0.9532 - val_accuracy: 0.7322 - val_loss: 0.5484 Epoch 11/150 219/219 - 1s - 3ms/step - accuracy: 0.7809 - loss: 0.9476 - val_accuracy: 0.7744 - val_loss: 0.4821 Epoch 12/150 219/219 - 1s - 3ms/step - accuracy: 0.7894 - loss: 0.9342 - val_accuracy: 0.7622 - val_loss: 0.5004 Epoch 13/150 219/219 - 1s - 3ms/step - accuracy: 0.7911 - loss: 0.9278 - val_accuracy: 0.8000 - val_loss: 0.4292 Epoch 14/150 219/219 - 1s - 3ms/step - accuracy: 0.7907 - loss: 0.9257 - val_accuracy: 0.7822 - val_loss: 0.4704 Epoch 15/150 219/219 - 1s - 3ms/step - accuracy: 0.7896 - loss: 0.9186 - val_accuracy: 0.7533 - val_loss: 0.4908 Epoch 16/150 219/219 - 1s - 3ms/step - accuracy: 0.7914 - loss: 0.9104 - val_accuracy: 0.7583 - val_loss: 0.5046 Epoch 17/150 219/219 - 1s - 4ms/step - accuracy: 0.7919 - loss: 0.9049 - val_accuracy: 0.7867 - val_loss: 0.4640 Epoch 18/150 219/219 - 1s - 6ms/step - accuracy: 0.7946 - loss: 0.9023 - val_accuracy: 0.7672 - val_loss: 0.4871 Epoch 19/150 219/219 - 1s - 7ms/step - accuracy: 0.7980 - loss: 0.8934 - val_accuracy: 0.7683 - val_loss: 0.4727 Epoch 20/150 219/219 - 1s - 3ms/step - accuracy: 0.7986 - loss: 0.8934 - val_accuracy: 0.7550 - val_loss: 0.4971 Epoch 21/150 219/219 - 1s - 3ms/step - accuracy: 0.7963 - loss: 0.8925 - val_accuracy: 0.7928 - val_loss: 0.4341 Epoch 22/150 219/219 - 1s - 3ms/step - accuracy: 0.8039 - loss: 0.8804 - val_accuracy: 0.7806 - val_loss: 0.4665 Epoch 23/150 219/219 - 1s - 3ms/step - accuracy: 0.8051 - loss: 0.8779 - val_accuracy: 0.7844 - val_loss: 0.4590 Epoch 24/150 219/219 - 1s - 3ms/step - accuracy: 0.8007 - loss: 0.8757 - val_accuracy: 0.7656 - val_loss: 0.4953 Epoch 25/150 219/219 - 1s - 3ms/step - accuracy: 0.8037 - loss: 0.8698 - val_accuracy: 0.7650 - val_loss: 0.4908 Epoch 26/150 219/219 - 1s - 3ms/step - accuracy: 0.8009 - loss: 0.8726 - val_accuracy: 0.7994 - val_loss: 0.4346 Epoch 27/150 219/219 - 1s - 3ms/step - accuracy: 0.8003 - loss: 0.8695 - val_accuracy: 0.7817 - val_loss: 0.4616 Epoch 28/150 219/219 - 1s - 3ms/step - accuracy: 0.8053 - loss: 0.8609 - val_accuracy: 0.7706 - val_loss: 0.4852 Epoch 29/150 219/219 - 1s - 3ms/step - accuracy: 0.8066 - loss: 0.8581 - val_accuracy: 0.7872 - val_loss: 0.4531 Epoch 30/150 219/219 - 1s - 6ms/step - accuracy: 0.8069 - loss: 0.8569 - val_accuracy: 0.7750 - val_loss: 0.4825 Epoch 31/150 219/219 - 1s - 3ms/step - accuracy: 0.8037 - loss: 0.8550 - val_accuracy: 0.7739 - val_loss: 0.4710 Epoch 32/150 219/219 - 1s - 3ms/step - accuracy: 0.8103 - loss: 0.8464 - val_accuracy: 0.8044 - val_loss: 0.4265 Epoch 33/150 219/219 - 1s - 3ms/step - accuracy: 0.8101 - loss: 0.8481 - val_accuracy: 0.7556 - val_loss: 0.4951 Epoch 34/150 219/219 - 1s - 4ms/step - accuracy: 0.8107 - loss: 0.8460 - val_accuracy: 0.7917 - val_loss: 0.4446 Epoch 35/150 219/219 - 1s - 6ms/step - accuracy: 0.8127 - loss: 0.8366 - val_accuracy: 0.7939 - val_loss: 0.4528 Epoch 36/150 219/219 - 1s - 4ms/step - accuracy: 0.8103 - loss: 0.8443 - val_accuracy: 0.7722 - val_loss: 0.4948 Epoch 37/150 219/219 - 1s - 5ms/step - accuracy: 0.8091 - loss: 0.8392 - val_accuracy: 0.7856 - val_loss: 0.4600 Epoch 38/150 219/219 - 1s - 4ms/step - accuracy: 0.8100 - loss: 0.8382 - val_accuracy: 0.7156 - val_loss: 0.5752 Epoch 39/150 219/219 - 1s - 3ms/step - accuracy: 0.8111 - loss: 0.8290 - val_accuracy: 0.7856 - val_loss: 0.4643 Epoch 40/150 219/219 - 1s - 5ms/step - accuracy: 0.8170 - loss: 0.8287 - val_accuracy: 0.7372 - val_loss: 0.5486 Epoch 41/150 219/219 - 1s - 3ms/step - accuracy: 0.8100 - loss: 0.8307 - val_accuracy: 0.7706 - val_loss: 0.4985 Epoch 42/150 219/219 - 1s - 3ms/step - accuracy: 0.8151 - loss: 0.8248 - val_accuracy: 0.8000 - val_loss: 0.4391 Epoch 43/150 219/219 - 1s - 3ms/step - accuracy: 0.8167 - loss: 0.8236 - val_accuracy: 0.8039 - val_loss: 0.4294 Epoch 44/150 219/219 - 1s - 3ms/step - accuracy: 0.8169 - loss: 0.8202 - val_accuracy: 0.7778 - val_loss: 0.4535 Epoch 45/150 219/219 - 1s - 3ms/step - accuracy: 0.8179 - loss: 0.8155 - val_accuracy: 0.7900 - val_loss: 0.4600 Epoch 46/150 219/219 - 1s - 3ms/step - accuracy: 0.8186 - loss: 0.8140 - val_accuracy: 0.7872 - val_loss: 0.4467 Epoch 47/150 219/219 - 1s - 3ms/step - accuracy: 0.8141 - loss: 0.8101 - val_accuracy: 0.8050 - val_loss: 0.4255 Epoch 48/150 219/219 - 1s - 3ms/step - accuracy: 0.8166 - loss: 0.8111 - val_accuracy: 0.7822 - val_loss: 0.4790 Epoch 49/150 219/219 - 1s - 3ms/step - accuracy: 0.8226 - loss: 0.8088 - val_accuracy: 0.7544 - val_loss: 0.5083 Epoch 50/150 219/219 - 1s - 3ms/step - accuracy: 0.8196 - loss: 0.8047 - val_accuracy: 0.7889 - val_loss: 0.4706 Epoch 51/150 219/219 - 1s - 3ms/step - accuracy: 0.8190 - loss: 0.7980 - val_accuracy: 0.8150 - val_loss: 0.4013 Epoch 52/150 219/219 - 1s - 3ms/step - accuracy: 0.8189 - loss: 0.8029 - val_accuracy: 0.7833 - val_loss: 0.4545 Epoch 53/150 219/219 - 1s - 5ms/step - accuracy: 0.8237 - loss: 0.7953 - val_accuracy: 0.7878 - val_loss: 0.4644 Epoch 54/150 219/219 - 1s - 5ms/step - accuracy: 0.8239 - loss: 0.8021 - val_accuracy: 0.7739 - val_loss: 0.4740 Epoch 55/150 219/219 - 1s - 6ms/step - accuracy: 0.8196 - loss: 0.7995 - val_accuracy: 0.8261 - val_loss: 0.4081 Epoch 56/150 219/219 - 1s - 5ms/step - accuracy: 0.8254 - loss: 0.7924 - val_accuracy: 0.7622 - val_loss: 0.5008 Epoch 57/150 219/219 - 1s - 3ms/step - accuracy: 0.8199 - loss: 0.7939 - val_accuracy: 0.8061 - val_loss: 0.4246 Epoch 58/150 219/219 - 1s - 3ms/step - accuracy: 0.8237 - loss: 0.7913 - val_accuracy: 0.7794 - val_loss: 0.4563 Epoch 59/150 219/219 - 1s - 3ms/step - accuracy: 0.8207 - loss: 0.7861 - val_accuracy: 0.8278 - val_loss: 0.4072 Epoch 60/150 219/219 - 1s - 3ms/step - accuracy: 0.8224 - loss: 0.7862 - val_accuracy: 0.7933 - val_loss: 0.4598 Epoch 61/150 219/219 - 1s - 3ms/step - accuracy: 0.8276 - loss: 0.7815 - val_accuracy: 0.7839 - val_loss: 0.4764 Epoch 62/150 219/219 - 1s - 3ms/step - accuracy: 0.8270 - loss: 0.7785 - val_accuracy: 0.8150 - val_loss: 0.4171 Epoch 63/150 219/219 - 1s - 3ms/step - accuracy: 0.8283 - loss: 0.7796 - val_accuracy: 0.7839 - val_loss: 0.4588 Epoch 64/150 219/219 - 1s - 3ms/step - accuracy: 0.8269 - loss: 0.7765 - val_accuracy: 0.7872 - val_loss: 0.4566 Epoch 65/150 219/219 - 1s - 3ms/step - accuracy: 0.8296 - loss: 0.7680 - val_accuracy: 0.7856 - val_loss: 0.4655 Epoch 66/150 219/219 - 1s - 6ms/step - accuracy: 0.8279 - loss: 0.7717 - val_accuracy: 0.7561 - val_loss: 0.5221 Epoch 67/150 219/219 - 1s - 3ms/step - accuracy: 0.8226 - loss: 0.7749 - val_accuracy: 0.7872 - val_loss: 0.4676 Epoch 68/150 219/219 - 1s - 6ms/step - accuracy: 0.8241 - loss: 0.7635 - val_accuracy: 0.7817 - val_loss: 0.4707 Epoch 69/150 219/219 - 1s - 4ms/step - accuracy: 0.8290 - loss: 0.7631 - val_accuracy: 0.7728 - val_loss: 0.4929 Epoch 70/150 219/219 - 1s - 6ms/step - accuracy: 0.8286 - loss: 0.7642 - val_accuracy: 0.7978 - val_loss: 0.4495 Epoch 71/150 219/219 - 1s - 6ms/step - accuracy: 0.8326 - loss: 0.7603 - val_accuracy: 0.7917 - val_loss: 0.4644 Epoch 72/150 219/219 - 1s - 3ms/step - accuracy: 0.8229 - loss: 0.7654 - val_accuracy: 0.7911 - val_loss: 0.4614 Epoch 73/150 219/219 - 1s - 3ms/step - accuracy: 0.8304 - loss: 0.7533 - val_accuracy: 0.8094 - val_loss: 0.4387 Epoch 74/150 219/219 - 1s - 3ms/step - accuracy: 0.8314 - loss: 0.7588 - val_accuracy: 0.7939 - val_loss: 0.4626 Epoch 75/150 219/219 - 1s - 3ms/step - accuracy: 0.8329 - loss: 0.7495 - val_accuracy: 0.7667 - val_loss: 0.4841 Epoch 76/150 219/219 - 1s - 3ms/step - accuracy: 0.8314 - loss: 0.7486 - val_accuracy: 0.8072 - val_loss: 0.4361 Epoch 77/150 219/219 - 1s - 3ms/step - accuracy: 0.8337 - loss: 0.7460 - val_accuracy: 0.7911 - val_loss: 0.4690 Epoch 78/150 219/219 - 1s - 3ms/step - accuracy: 0.8339 - loss: 0.7444 - val_accuracy: 0.7789 - val_loss: 0.4838 Epoch 79/150 219/219 - 1s - 3ms/step - accuracy: 0.8320 - loss: 0.7458 - val_accuracy: 0.7828 - val_loss: 0.4892 Epoch 80/150 219/219 - 1s - 3ms/step - accuracy: 0.8313 - loss: 0.7433 - val_accuracy: 0.7872 - val_loss: 0.4516 Epoch 81/150 219/219 - 1s - 3ms/step - accuracy: 0.8327 - loss: 0.7451 - val_accuracy: 0.8111 - val_loss: 0.4353 Epoch 82/150 219/219 - 1s - 3ms/step - accuracy: 0.8327 - loss: 0.7369 - val_accuracy: 0.8067 - val_loss: 0.4408 Epoch 83/150 219/219 - 1s - 3ms/step - accuracy: 0.8356 - loss: 0.7290 - val_accuracy: 0.7833 - val_loss: 0.4954 Epoch 84/150 219/219 - 1s - 3ms/step - accuracy: 0.8307 - loss: 0.7353 - val_accuracy: 0.8067 - val_loss: 0.4406 Epoch 85/150 219/219 - 1s - 3ms/step - accuracy: 0.8367 - loss: 0.7302 - val_accuracy: 0.7911 - val_loss: 0.4859 Epoch 86/150 219/219 - 1s - 5ms/step - accuracy: 0.8329 - loss: 0.7306 - val_accuracy: 0.8122 - val_loss: 0.4327 Epoch 87/150 219/219 - 1s - 6ms/step - accuracy: 0.8356 - loss: 0.7252 - val_accuracy: 0.8017 - val_loss: 0.4467 Epoch 88/150 219/219 - 1s - 5ms/step - accuracy: 0.8373 - loss: 0.7256 - val_accuracy: 0.7983 - val_loss: 0.4639 Epoch 89/150 219/219 - 1s - 6ms/step - accuracy: 0.8389 - loss: 0.7183 - val_accuracy: 0.7844 - val_loss: 0.4794 Epoch 90/150 219/219 - 1s - 3ms/step - accuracy: 0.8361 - loss: 0.7226 - val_accuracy: 0.7939 - val_loss: 0.4637 Epoch 91/150 219/219 - 1s - 3ms/step - accuracy: 0.8403 - loss: 0.7124 - val_accuracy: 0.7817 - val_loss: 0.4792 Epoch 92/150 219/219 - 1s - 3ms/step - accuracy: 0.8414 - loss: 0.7091 - val_accuracy: 0.8117 - val_loss: 0.4246 Epoch 93/150 219/219 - 1s - 6ms/step - accuracy: 0.8357 - loss: 0.7213 - val_accuracy: 0.8078 - val_loss: 0.4404 Epoch 94/150 219/219 - 1s - 3ms/step - accuracy: 0.8377 - loss: 0.7113 - val_accuracy: 0.7939 - val_loss: 0.4621 Epoch 95/150 219/219 - 1s - 3ms/step - accuracy: 0.8390 - loss: 0.7014 - val_accuracy: 0.8278 - val_loss: 0.4083 Epoch 96/150 219/219 - 1s - 3ms/step - accuracy: 0.8426 - loss: 0.7066 - val_accuracy: 0.8006 - val_loss: 0.4509 Epoch 97/150 219/219 - 1s - 6ms/step - accuracy: 0.8369 - loss: 0.7021 - val_accuracy: 0.7822 - val_loss: 0.4813 Epoch 98/150 219/219 - 1s - 3ms/step - accuracy: 0.8424 - loss: 0.7002 - val_accuracy: 0.7944 - val_loss: 0.4586 Epoch 99/150 219/219 - 1s - 3ms/step - accuracy: 0.8420 - loss: 0.7003 - val_accuracy: 0.8039 - val_loss: 0.4495 Epoch 100/150 219/219 - 1s - 3ms/step - accuracy: 0.8406 - loss: 0.7002 - val_accuracy: 0.8017 - val_loss: 0.4516 Epoch 101/150 219/219 - 1s - 3ms/step - accuracy: 0.8484 - loss: 0.6892 - val_accuracy: 0.8300 - val_loss: 0.4072 Epoch 102/150 219/219 - 1s - 3ms/step - accuracy: 0.8416 - loss: 0.6881 - val_accuracy: 0.8000 - val_loss: 0.4704 Epoch 103/150 219/219 - 1s - 3ms/step - accuracy: 0.8463 - loss: 0.6847 - val_accuracy: 0.7211 - val_loss: 0.6031 Epoch 104/150 219/219 - 2s - 7ms/step - accuracy: 0.8369 - loss: 0.6892 - val_accuracy: 0.8050 - val_loss: 0.4399 Epoch 105/150 219/219 - 1s - 6ms/step - accuracy: 0.8424 - loss: 0.6864 - val_accuracy: 0.8117 - val_loss: 0.4397 Epoch 106/150 219/219 - 1s - 3ms/step - accuracy: 0.8481 - loss: 0.6765 - val_accuracy: 0.7578 - val_loss: 0.5331 Epoch 107/150 219/219 - 1s - 5ms/step - accuracy: 0.8489 - loss: 0.6746 - val_accuracy: 0.7617 - val_loss: 0.5351 Epoch 108/150 219/219 - 1s - 3ms/step - accuracy: 0.8459 - loss: 0.6828 - val_accuracy: 0.7706 - val_loss: 0.5186 Epoch 109/150 219/219 - 1s - 3ms/step - accuracy: 0.8416 - loss: 0.6779 - val_accuracy: 0.7756 - val_loss: 0.4957 Epoch 110/150 219/219 - 1s - 3ms/step - accuracy: 0.8473 - loss: 0.6687 - val_accuracy: 0.7700 - val_loss: 0.5200 Epoch 111/150 219/219 - 1s - 3ms/step - accuracy: 0.8480 - loss: 0.6689 - val_accuracy: 0.7928 - val_loss: 0.4828 Epoch 112/150 219/219 - 1s - 6ms/step - accuracy: 0.8544 - loss: 0.6666 - val_accuracy: 0.7650 - val_loss: 0.5256 Epoch 113/150 219/219 - 1s - 3ms/step - accuracy: 0.8501 - loss: 0.6628 - val_accuracy: 0.7906 - val_loss: 0.4822 Epoch 114/150 219/219 - 1s - 3ms/step - accuracy: 0.8506 - loss: 0.6574 - val_accuracy: 0.7906 - val_loss: 0.4851 Epoch 115/150 219/219 - 1s - 3ms/step - accuracy: 0.8517 - loss: 0.6643 - val_accuracy: 0.7961 - val_loss: 0.4734 Epoch 116/150 219/219 - 1s - 3ms/step - accuracy: 0.8550 - loss: 0.6565 - val_accuracy: 0.8133 - val_loss: 0.4361 Epoch 117/150 219/219 - 1s - 3ms/step - accuracy: 0.8530 - loss: 0.6471 - val_accuracy: 0.7906 - val_loss: 0.4913 Epoch 118/150 219/219 - 1s - 3ms/step - accuracy: 0.8507 - loss: 0.6548 - val_accuracy: 0.7828 - val_loss: 0.5005 Epoch 119/150 219/219 - 1s - 3ms/step - accuracy: 0.8481 - loss: 0.6511 - val_accuracy: 0.8300 - val_loss: 0.4175 Epoch 120/150 219/219 - 1s - 3ms/step - accuracy: 0.8537 - loss: 0.6531 - val_accuracy: 0.8044 - val_loss: 0.4587 Epoch 121/150 219/219 - 1s - 4ms/step - accuracy: 0.8523 - loss: 0.6453 - val_accuracy: 0.7556 - val_loss: 0.5537 Epoch 122/150 219/219 - 1s - 6ms/step - accuracy: 0.8540 - loss: 0.6431 - val_accuracy: 0.7906 - val_loss: 0.4931 Epoch 123/150 219/219 - 1s - 4ms/step - accuracy: 0.8584 - loss: 0.6374 - val_accuracy: 0.7983 - val_loss: 0.4510 Epoch 124/150 219/219 - 1s - 5ms/step - accuracy: 0.8571 - loss: 0.6381 - val_accuracy: 0.7928 - val_loss: 0.4843 Epoch 125/150 219/219 - 1s - 3ms/step - accuracy: 0.8526 - loss: 0.6315 - val_accuracy: 0.8150 - val_loss: 0.4552 Epoch 126/150 219/219 - 1s - 3ms/step - accuracy: 0.8569 - loss: 0.6287 - val_accuracy: 0.7794 - val_loss: 0.5091 Epoch 127/150 219/219 - 1s - 3ms/step - accuracy: 0.8579 - loss: 0.6239 - val_accuracy: 0.8061 - val_loss: 0.4592 Epoch 128/150 219/219 - 1s - 3ms/step - accuracy: 0.8529 - loss: 0.6305 - val_accuracy: 0.8133 - val_loss: 0.4542 Epoch 129/150 219/219 - 1s - 3ms/step - accuracy: 0.8597 - loss: 0.6247 - val_accuracy: 0.7806 - val_loss: 0.4928 Epoch 130/150 219/219 - 1s - 6ms/step - accuracy: 0.8561 - loss: 0.6250 - val_accuracy: 0.7622 - val_loss: 0.5505 Epoch 131/150 219/219 - 1s - 3ms/step - accuracy: 0.8597 - loss: 0.6193 - val_accuracy: 0.7950 - val_loss: 0.4707 Epoch 132/150 219/219 - 1s - 3ms/step - accuracy: 0.8571 - loss: 0.6187 - val_accuracy: 0.8017 - val_loss: 0.4711 Epoch 133/150 219/219 - 1s - 3ms/step - accuracy: 0.8621 - loss: 0.6065 - val_accuracy: 0.8028 - val_loss: 0.4666 Epoch 134/150 219/219 - 1s - 3ms/step - accuracy: 0.8620 - loss: 0.6109 - val_accuracy: 0.8022 - val_loss: 0.4577 Epoch 135/150 219/219 - 1s - 3ms/step - accuracy: 0.8614 - loss: 0.6058 - val_accuracy: 0.7961 - val_loss: 0.4746 Epoch 136/150 219/219 - 1s - 3ms/step - accuracy: 0.8614 - loss: 0.6083 - val_accuracy: 0.8111 - val_loss: 0.4510 Epoch 137/150 219/219 - 1s - 3ms/step - accuracy: 0.8597 - loss: 0.6050 - val_accuracy: 0.7889 - val_loss: 0.4962 Epoch 138/150 219/219 - 1s - 7ms/step - accuracy: 0.8644 - loss: 0.6010 - val_accuracy: 0.8022 - val_loss: 0.4608 Epoch 139/150 219/219 - 1s - 7ms/step - accuracy: 0.8586 - loss: 0.6033 - val_accuracy: 0.7639 - val_loss: 0.5451 Epoch 140/150 219/219 - 1s - 3ms/step - accuracy: 0.8671 - loss: 0.5932 - val_accuracy: 0.7950 - val_loss: 0.4928 Epoch 141/150 219/219 - 1s - 6ms/step - accuracy: 0.8620 - loss: 0.6061 - val_accuracy: 0.7656 - val_loss: 0.5397 Epoch 142/150 219/219 - 1s - 3ms/step - accuracy: 0.8626 - loss: 0.5922 - val_accuracy: 0.8017 - val_loss: 0.4755 Epoch 143/150 219/219 - 1s - 3ms/step - accuracy: 0.8657 - loss: 0.5918 - val_accuracy: 0.8200 - val_loss: 0.4569 Epoch 144/150 219/219 - 1s - 3ms/step - accuracy: 0.8677 - loss: 0.5860 - val_accuracy: 0.8006 - val_loss: 0.4842 Epoch 145/150 219/219 - 1s - 3ms/step - accuracy: 0.8706 - loss: 0.5824 - val_accuracy: 0.7933 - val_loss: 0.4853 Epoch 146/150 219/219 - 1s - 3ms/step - accuracy: 0.8684 - loss: 0.5825 - val_accuracy: 0.8061 - val_loss: 0.4834 Epoch 147/150 219/219 - 1s - 3ms/step - accuracy: 0.8686 - loss: 0.5873 - val_accuracy: 0.7867 - val_loss: 0.5104 Epoch 148/150 219/219 - 1s - 3ms/step - accuracy: 0.8660 - loss: 0.5817 - val_accuracy: 0.8050 - val_loss: 0.4900 Epoch 149/150 219/219 - 1s - 3ms/step - accuracy: 0.8671 - loss: 0.5737 - val_accuracy: 0.8139 - val_loss: 0.4713 Epoch 150/150 219/219 - 1s - 3ms/step - accuracy: 0.8667 - loss: 0.5713 - val_accuracy: 0.7861 - val_loss: 0.5244
#plot Accuracy
plot(history12,'accuracy')
#plot Loss
plot(history12,'loss')
#Comparing model performance
results.loc[12] = [2,[64,32],['relu','tanh'],150,32,"adam",['-', "-"],"xavier","-",history12.history["loss"][-1],history12.history["val_loss"][-1],history12.history["accuracy"][-1],history12.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
| 10 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2] | 0.892885 | 0.456302 | 0.804857 | 0.776111 | -7129.34 |
| 11 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2, BatchNormalization] | 0.808190 | 0.446884 | 0.810143 | 0.791111 | -7129.34 |
| 12 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | - | 0.571309 | 0.524371 | 0.866714 | 0.786111 | -7129.34 |
#model performance for training data
model_performance_classification(model12, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.859571 | 0.859571 | 0.898801 | 0.86913 |
#model performance for validation
model_performance_classification(model12, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.786111 | 0.786111 | 0.825131 | 0.798836 |
Observations
- Training and validation loss have reduced to 55% and 47.9% respectively.
- Training and Validation accuracy and recall have improved but not as 3,response shows it has more oscillations.
Neural Network with Adam Optimizer and Dropout¶
Let's add dropout layer with adam optimizer.
Adam (Adaptive Moment Estimation) adjusts the learning rate in an adaptive manner This enables it to converge faster and more reliably compared to SGD with momentum, especially in scenarios with large datasets and complex optimization landscapes
With Adam optimizer model 7.
- 1 input, 2 hidden(64,32), 1 output layers
- relu and tanh activations for the first and second hidden layer respectively.
- batch size=32
- dropout layer of 0.2
#clear keras session
tf.keras.backend.clear_session()
#define model
model14 = Sequential()
model14.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model14.add(Dropout(0.5))
model14.add(Dense(32,activation="tanh"))
model14.add(Dropout(0.5))
model14.add(Dense(1,activation="sigmoid"))
#model summary
model14.summary()
Model: "sequential_1"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense_3 (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dropout_2 (Dropout) │ (None, 64) │ 0 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_4 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dropout_3 (Dropout) │ (None, 32) │ 0 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_5 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#Compile model
optimizer = tf.keras.optimizers.Adam() # defining SGD as the optimizer to be used
model14.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#Start training
start = time.time()
history14 = model14.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 32)
Epoch 1/150 219/219 - 2s - 10ms/step - accuracy: 0.6087 - loss: 1.3766 - val_accuracy: 0.6967 - val_loss: 0.5926 Epoch 2/150 219/219 - 1s - 3ms/step - accuracy: 0.6597 - loss: 1.2665 - val_accuracy: 0.6750 - val_loss: 0.6019 Epoch 3/150 219/219 - 1s - 4ms/step - accuracy: 0.6713 - loss: 1.2171 - val_accuracy: 0.7017 - val_loss: 0.5788 Epoch 4/150 219/219 - 1s - 6ms/step - accuracy: 0.6797 - loss: 1.2070 - val_accuracy: 0.7306 - val_loss: 0.5396 Epoch 5/150 219/219 - 1s - 6ms/step - accuracy: 0.6947 - loss: 1.1840 - val_accuracy: 0.7122 - val_loss: 0.5533 Epoch 6/150 219/219 - 1s - 3ms/step - accuracy: 0.6910 - loss: 1.1707 - val_accuracy: 0.7233 - val_loss: 0.5448 Epoch 7/150 219/219 - 1s - 6ms/step - accuracy: 0.6950 - loss: 1.1656 - val_accuracy: 0.7417 - val_loss: 0.5256 Epoch 8/150 219/219 - 1s - 3ms/step - accuracy: 0.7040 - loss: 1.1550 - val_accuracy: 0.7161 - val_loss: 0.5523 Epoch 9/150 219/219 - 1s - 6ms/step - accuracy: 0.7093 - loss: 1.1338 - val_accuracy: 0.7339 - val_loss: 0.5312 Epoch 10/150 219/219 - 1s - 6ms/step - accuracy: 0.7160 - loss: 1.1361 - val_accuracy: 0.7511 - val_loss: 0.5121 Epoch 11/150 219/219 - 1s - 3ms/step - accuracy: 0.7220 - loss: 1.1165 - val_accuracy: 0.7472 - val_loss: 0.5175 Epoch 12/150 219/219 - 1s - 5ms/step - accuracy: 0.7251 - loss: 1.1118 - val_accuracy: 0.7489 - val_loss: 0.5079 Epoch 13/150 219/219 - 1s - 3ms/step - accuracy: 0.7200 - loss: 1.1109 - val_accuracy: 0.7717 - val_loss: 0.4844 Epoch 14/150 219/219 - 1s - 6ms/step - accuracy: 0.7353 - loss: 1.0993 - val_accuracy: 0.7422 - val_loss: 0.5177 Epoch 15/150 219/219 - 1s - 3ms/step - accuracy: 0.7320 - loss: 1.1040 - val_accuracy: 0.7311 - val_loss: 0.5330 Epoch 16/150 219/219 - 2s - 7ms/step - accuracy: 0.7337 - loss: 1.0858 - val_accuracy: 0.7450 - val_loss: 0.5075 Epoch 17/150 219/219 - 1s - 6ms/step - accuracy: 0.7364 - loss: 1.0856 - val_accuracy: 0.7328 - val_loss: 0.5345 Epoch 18/150 219/219 - 1s - 4ms/step - accuracy: 0.7436 - loss: 1.0722 - val_accuracy: 0.7611 - val_loss: 0.4987 Epoch 19/150 219/219 - 1s - 3ms/step - accuracy: 0.7553 - loss: 1.0645 - val_accuracy: 0.7561 - val_loss: 0.4983 Epoch 20/150 219/219 - 1s - 6ms/step - accuracy: 0.7450 - loss: 1.0690 - val_accuracy: 0.7672 - val_loss: 0.4920 Epoch 21/150 219/219 - 1s - 6ms/step - accuracy: 0.7547 - loss: 1.0488 - val_accuracy: 0.7667 - val_loss: 0.4931 Epoch 22/150 219/219 - 1s - 3ms/step - accuracy: 0.7621 - loss: 1.0533 - val_accuracy: 0.7822 - val_loss: 0.4691 Epoch 23/150 219/219 - 1s - 3ms/step - accuracy: 0.7560 - loss: 1.0552 - val_accuracy: 0.7833 - val_loss: 0.4651 Epoch 24/150 219/219 - 1s - 3ms/step - accuracy: 0.7684 - loss: 1.0415 - val_accuracy: 0.7767 - val_loss: 0.4715 Epoch 25/150 219/219 - 1s - 3ms/step - accuracy: 0.7716 - loss: 1.0283 - val_accuracy: 0.7556 - val_loss: 0.5024 Epoch 26/150 219/219 - 1s - 3ms/step - accuracy: 0.7677 - loss: 1.0212 - val_accuracy: 0.8039 - val_loss: 0.4297 Epoch 27/150 219/219 - 1s - 6ms/step - accuracy: 0.7720 - loss: 1.0245 - val_accuracy: 0.7750 - val_loss: 0.4711 Epoch 28/150 219/219 - 2s - 8ms/step - accuracy: 0.7750 - loss: 1.0107 - val_accuracy: 0.7794 - val_loss: 0.4721 Epoch 29/150 219/219 - 2s - 8ms/step - accuracy: 0.7799 - loss: 1.0083 - val_accuracy: 0.7867 - val_loss: 0.4580 Epoch 30/150 219/219 - 3s - 12ms/step - accuracy: 0.7851 - loss: 1.0105 - val_accuracy: 0.7717 - val_loss: 0.4750 Epoch 31/150 219/219 - 2s - 11ms/step - accuracy: 0.7749 - loss: 1.0186 - val_accuracy: 0.7744 - val_loss: 0.4736 Epoch 32/150 219/219 - 2s - 9ms/step - accuracy: 0.7824 - loss: 1.0022 - val_accuracy: 0.7956 - val_loss: 0.4471 Epoch 33/150 219/219 - 1s - 6ms/step - accuracy: 0.7877 - loss: 1.0008 - val_accuracy: 0.7894 - val_loss: 0.4553 Epoch 34/150 219/219 - 1s - 3ms/step - accuracy: 0.7890 - loss: 0.9858 - val_accuracy: 0.7906 - val_loss: 0.4584 Epoch 35/150 219/219 - 1s - 3ms/step - accuracy: 0.7910 - loss: 0.9919 - val_accuracy: 0.7794 - val_loss: 0.4715 Epoch 36/150 219/219 - 1s - 3ms/step - accuracy: 0.7833 - loss: 0.9990 - val_accuracy: 0.7911 - val_loss: 0.4584 Epoch 37/150 219/219 - 1s - 3ms/step - accuracy: 0.7954 - loss: 0.9867 - val_accuracy: 0.7572 - val_loss: 0.4915 Epoch 38/150 219/219 - 1s - 3ms/step - accuracy: 0.7899 - loss: 0.9611 - val_accuracy: 0.7806 - val_loss: 0.4510 Epoch 39/150 219/219 - 1s - 3ms/step - accuracy: 0.7950 - loss: 0.9804 - val_accuracy: 0.7683 - val_loss: 0.4774 Epoch 40/150 219/219 - 1s - 3ms/step - accuracy: 0.7889 - loss: 0.9928 - val_accuracy: 0.7994 - val_loss: 0.4373 Epoch 41/150 219/219 - 1s - 5ms/step - accuracy: 0.7933 - loss: 0.9665 - val_accuracy: 0.7939 - val_loss: 0.4498 Epoch 42/150 219/219 - 1s - 6ms/step - accuracy: 0.7854 - loss: 0.9865 - val_accuracy: 0.7761 - val_loss: 0.4728 Epoch 43/150 219/219 - 1s - 5ms/step - accuracy: 0.7899 - loss: 0.9763 - val_accuracy: 0.7789 - val_loss: 0.4729 Epoch 44/150 219/219 - 1s - 4ms/step - accuracy: 0.7937 - loss: 0.9797 - val_accuracy: 0.7756 - val_loss: 0.4695 Epoch 45/150 219/219 - 1s - 3ms/step - accuracy: 0.7871 - loss: 0.9824 - val_accuracy: 0.8022 - val_loss: 0.4465 Epoch 46/150 219/219 - 1s - 3ms/step - accuracy: 0.7906 - loss: 0.9786 - val_accuracy: 0.7933 - val_loss: 0.4550 Epoch 47/150 219/219 - 1s - 3ms/step - accuracy: 0.7887 - loss: 0.9799 - val_accuracy: 0.7817 - val_loss: 0.4592 Epoch 48/150 219/219 - 1s - 3ms/step - accuracy: 0.7956 - loss: 0.9572 - val_accuracy: 0.7978 - val_loss: 0.4414 Epoch 49/150 219/219 - 1s - 3ms/step - accuracy: 0.7920 - loss: 0.9687 - val_accuracy: 0.7694 - val_loss: 0.4817 Epoch 50/150 219/219 - 1s - 5ms/step - accuracy: 0.7923 - loss: 0.9570 - val_accuracy: 0.7917 - val_loss: 0.4412 Epoch 51/150 219/219 - 1s - 3ms/step - accuracy: 0.7943 - loss: 0.9695 - val_accuracy: 0.7850 - val_loss: 0.4646 Epoch 52/150 219/219 - 1s - 6ms/step - accuracy: 0.7984 - loss: 0.9659 - val_accuracy: 0.7767 - val_loss: 0.4742 Epoch 53/150 219/219 - 1s - 6ms/step - accuracy: 0.7896 - loss: 0.9644 - val_accuracy: 0.7772 - val_loss: 0.4707 Epoch 54/150 219/219 - 1s - 3ms/step - accuracy: 0.7897 - loss: 0.9711 - val_accuracy: 0.8033 - val_loss: 0.4367 Epoch 55/150 219/219 - 2s - 7ms/step - accuracy: 0.7971 - loss: 0.9576 - val_accuracy: 0.8094 - val_loss: 0.4341 Epoch 56/150 219/219 - 1s - 6ms/step - accuracy: 0.7947 - loss: 0.9597 - val_accuracy: 0.7989 - val_loss: 0.4543 Epoch 57/150 219/219 - 1s - 6ms/step - accuracy: 0.7943 - loss: 0.9670 - val_accuracy: 0.7822 - val_loss: 0.4567 Epoch 58/150 219/219 - 1s - 3ms/step - accuracy: 0.8021 - loss: 0.9471 - val_accuracy: 0.7806 - val_loss: 0.4607 Epoch 59/150 219/219 - 1s - 3ms/step - accuracy: 0.7954 - loss: 0.9687 - val_accuracy: 0.7822 - val_loss: 0.4635 Epoch 60/150 219/219 - 1s - 3ms/step - accuracy: 0.7983 - loss: 0.9497 - val_accuracy: 0.8056 - val_loss: 0.4301 Epoch 61/150 219/219 - 1s - 6ms/step - accuracy: 0.8034 - loss: 0.9462 - val_accuracy: 0.7806 - val_loss: 0.4665 Epoch 62/150 219/219 - 1s - 3ms/step - accuracy: 0.8037 - loss: 0.9525 - val_accuracy: 0.7794 - val_loss: 0.4616 Epoch 63/150 219/219 - 1s - 3ms/step - accuracy: 0.7891 - loss: 0.9548 - val_accuracy: 0.7917 - val_loss: 0.4488 Epoch 64/150 219/219 - 1s - 6ms/step - accuracy: 0.7981 - loss: 0.9477 - val_accuracy: 0.7900 - val_loss: 0.4450 Epoch 65/150 219/219 - 1s - 3ms/step - accuracy: 0.8044 - loss: 0.9504 - val_accuracy: 0.8028 - val_loss: 0.4338 Epoch 66/150 219/219 - 1s - 5ms/step - accuracy: 0.7959 - loss: 0.9497 - val_accuracy: 0.7922 - val_loss: 0.4478 Epoch 67/150 219/219 - 1s - 3ms/step - accuracy: 0.7979 - loss: 0.9589 - val_accuracy: 0.7922 - val_loss: 0.4553 Epoch 68/150 219/219 - 1s - 3ms/step - accuracy: 0.7921 - loss: 0.9553 - val_accuracy: 0.7967 - val_loss: 0.4483 Epoch 69/150 219/219 - 1s - 3ms/step - accuracy: 0.7943 - loss: 0.9545 - val_accuracy: 0.7983 - val_loss: 0.4412 Epoch 70/150 219/219 - 2s - 7ms/step - accuracy: 0.7987 - loss: 0.9511 - val_accuracy: 0.7911 - val_loss: 0.4549 Epoch 71/150 219/219 - 1s - 5ms/step - accuracy: 0.8000 - loss: 0.9548 - val_accuracy: 0.7961 - val_loss: 0.4485 Epoch 72/150 219/219 - 1s - 5ms/step - accuracy: 0.7940 - loss: 0.9560 - val_accuracy: 0.7839 - val_loss: 0.4688 Epoch 73/150 219/219 - 1s - 4ms/step - accuracy: 0.7939 - loss: 0.9524 - val_accuracy: 0.8017 - val_loss: 0.4403 Epoch 74/150 219/219 - 1s - 6ms/step - accuracy: 0.7981 - loss: 0.9418 - val_accuracy: 0.7989 - val_loss: 0.4479 Epoch 75/150 219/219 - 1s - 6ms/step - accuracy: 0.8003 - loss: 0.9464 - val_accuracy: 0.7756 - val_loss: 0.4741 Epoch 76/150 219/219 - 1s - 3ms/step - accuracy: 0.7939 - loss: 0.9507 - val_accuracy: 0.7900 - val_loss: 0.4574 Epoch 77/150 219/219 - 1s - 5ms/step - accuracy: 0.8020 - loss: 0.9388 - val_accuracy: 0.7906 - val_loss: 0.4512 Epoch 78/150 219/219 - 1s - 3ms/step - accuracy: 0.7977 - loss: 0.9455 - val_accuracy: 0.7950 - val_loss: 0.4465 Epoch 79/150 219/219 - 1s - 3ms/step - accuracy: 0.8070 - loss: 0.9383 - val_accuracy: 0.7889 - val_loss: 0.4528 Epoch 80/150 219/219 - 1s - 3ms/step - accuracy: 0.7999 - loss: 0.9466 - val_accuracy: 0.7872 - val_loss: 0.4574 Epoch 81/150 219/219 - 1s - 3ms/step - accuracy: 0.8017 - loss: 0.9395 - val_accuracy: 0.7922 - val_loss: 0.4513 Epoch 82/150 219/219 - 1s - 3ms/step - accuracy: 0.7951 - loss: 0.9532 - val_accuracy: 0.7811 - val_loss: 0.4685 Epoch 83/150 219/219 - 1s - 6ms/step - accuracy: 0.7983 - loss: 0.9390 - val_accuracy: 0.7872 - val_loss: 0.4567 Epoch 84/150 219/219 - 1s - 3ms/step - accuracy: 0.8024 - loss: 0.9345 - val_accuracy: 0.7989 - val_loss: 0.4388 Epoch 85/150 219/219 - 1s - 7ms/step - accuracy: 0.8057 - loss: 0.9334 - val_accuracy: 0.7828 - val_loss: 0.4638 Epoch 86/150 219/219 - 1s - 7ms/step - accuracy: 0.7980 - loss: 0.9379 - val_accuracy: 0.7794 - val_loss: 0.4625 Epoch 87/150 219/219 - 1s - 3ms/step - accuracy: 0.8023 - loss: 0.9477 - val_accuracy: 0.7739 - val_loss: 0.4669 Epoch 88/150 219/219 - 1s - 6ms/step - accuracy: 0.7999 - loss: 0.9208 - val_accuracy: 0.7761 - val_loss: 0.4632 Epoch 89/150 219/219 - 1s - 3ms/step - accuracy: 0.7991 - loss: 0.9336 - val_accuracy: 0.7894 - val_loss: 0.4412 Epoch 90/150 219/219 - 1s - 6ms/step - accuracy: 0.8059 - loss: 0.9284 - val_accuracy: 0.7967 - val_loss: 0.4433 Epoch 91/150 219/219 - 1s - 6ms/step - accuracy: 0.7999 - loss: 0.9405 - val_accuracy: 0.7800 - val_loss: 0.4645 Epoch 92/150 219/219 - 1s - 3ms/step - accuracy: 0.7989 - loss: 0.9305 - val_accuracy: 0.7917 - val_loss: 0.4505 Epoch 93/150 219/219 - 1s - 3ms/step - accuracy: 0.8023 - loss: 0.9327 - val_accuracy: 0.7828 - val_loss: 0.4606 Epoch 94/150 219/219 - 1s - 6ms/step - accuracy: 0.8057 - loss: 0.9356 - val_accuracy: 0.7906 - val_loss: 0.4491 Epoch 95/150 219/219 - 1s - 6ms/step - accuracy: 0.8011 - loss: 0.9270 - val_accuracy: 0.7967 - val_loss: 0.4397 Epoch 96/150 219/219 - 1s - 3ms/step - accuracy: 0.8036 - loss: 0.9338 - val_accuracy: 0.7878 - val_loss: 0.4495 Epoch 97/150 219/219 - 1s - 3ms/step - accuracy: 0.8057 - loss: 0.9386 - val_accuracy: 0.7806 - val_loss: 0.4687 Epoch 98/150 219/219 - 1s - 5ms/step - accuracy: 0.7996 - loss: 0.9246 - val_accuracy: 0.8067 - val_loss: 0.4244 Epoch 99/150 219/219 - 1s - 5ms/step - accuracy: 0.8049 - loss: 0.9376 - val_accuracy: 0.7933 - val_loss: 0.4527 Epoch 100/150 219/219 - 1s - 6ms/step - accuracy: 0.7999 - loss: 0.9409 - val_accuracy: 0.7867 - val_loss: 0.4631 Epoch 101/150 219/219 - 1s - 3ms/step - accuracy: 0.8064 - loss: 0.9234 - val_accuracy: 0.7878 - val_loss: 0.4562 Epoch 102/150 219/219 - 1s - 3ms/step - accuracy: 0.8034 - loss: 0.9316 - val_accuracy: 0.7917 - val_loss: 0.4455 Epoch 103/150 219/219 - 1s - 3ms/step - accuracy: 0.8084 - loss: 0.9249 - val_accuracy: 0.7967 - val_loss: 0.4458 Epoch 104/150 219/219 - 1s - 5ms/step - accuracy: 0.8003 - loss: 0.9401 - val_accuracy: 0.7844 - val_loss: 0.4681 Epoch 105/150 219/219 - 1s - 3ms/step - accuracy: 0.8029 - loss: 0.9191 - val_accuracy: 0.7739 - val_loss: 0.4681 Epoch 106/150 219/219 - 1s - 6ms/step - accuracy: 0.8057 - loss: 0.9225 - val_accuracy: 0.7944 - val_loss: 0.4481 Epoch 107/150 219/219 - 1s - 3ms/step - accuracy: 0.7999 - loss: 0.9248 - val_accuracy: 0.7994 - val_loss: 0.4470 Epoch 108/150 219/219 - 1s - 6ms/step - accuracy: 0.8004 - loss: 0.9361 - val_accuracy: 0.7750 - val_loss: 0.4764 Epoch 109/150 219/219 - 1s - 3ms/step - accuracy: 0.8036 - loss: 0.9244 - val_accuracy: 0.7928 - val_loss: 0.4527 Epoch 110/150 219/219 - 1s - 6ms/step - accuracy: 0.8064 - loss: 0.9287 - val_accuracy: 0.7833 - val_loss: 0.4633 Epoch 111/150 219/219 - 1s - 3ms/step - accuracy: 0.7989 - loss: 0.9379 - val_accuracy: 0.8000 - val_loss: 0.4396 Epoch 112/150 219/219 - 1s - 3ms/step - accuracy: 0.8063 - loss: 0.9170 - val_accuracy: 0.7906 - val_loss: 0.4459 Epoch 113/150 219/219 - 2s - 7ms/step - accuracy: 0.8044 - loss: 0.9182 - val_accuracy: 0.7906 - val_loss: 0.4579 Epoch 114/150 219/219 - 1s - 6ms/step - accuracy: 0.8047 - loss: 0.9238 - val_accuracy: 0.7928 - val_loss: 0.4538 Epoch 115/150 219/219 - 1s - 3ms/step - accuracy: 0.8080 - loss: 0.9006 - val_accuracy: 0.7900 - val_loss: 0.4545 Epoch 116/150 219/219 - 1s - 3ms/step - accuracy: 0.8076 - loss: 0.9192 - val_accuracy: 0.7750 - val_loss: 0.4714 Epoch 117/150 219/219 - 1s - 6ms/step - accuracy: 0.8073 - loss: 0.9099 - val_accuracy: 0.7794 - val_loss: 0.4661 Epoch 118/150 219/219 - 1s - 3ms/step - accuracy: 0.8027 - loss: 0.9278 - val_accuracy: 0.7939 - val_loss: 0.4495 Epoch 119/150 219/219 - 1s - 6ms/step - accuracy: 0.8013 - loss: 0.9203 - val_accuracy: 0.7983 - val_loss: 0.4461 Epoch 120/150 219/219 - 1s - 3ms/step - accuracy: 0.7994 - loss: 0.9173 - val_accuracy: 0.7944 - val_loss: 0.4448 Epoch 121/150 219/219 - 1s - 3ms/step - accuracy: 0.8043 - loss: 0.9272 - val_accuracy: 0.7817 - val_loss: 0.4637 Epoch 122/150 219/219 - 1s - 3ms/step - accuracy: 0.7987 - loss: 0.9114 - val_accuracy: 0.8022 - val_loss: 0.4252 Epoch 123/150 219/219 - 1s - 6ms/step - accuracy: 0.8051 - loss: 0.9298 - val_accuracy: 0.7850 - val_loss: 0.4601 Epoch 124/150 219/219 - 1s - 3ms/step - accuracy: 0.7950 - loss: 0.9217 - val_accuracy: 0.7939 - val_loss: 0.4360 Epoch 125/150 219/219 - 1s - 3ms/step - accuracy: 0.8051 - loss: 0.9234 - val_accuracy: 0.7839 - val_loss: 0.4567 Epoch 126/150 219/219 - 1s - 3ms/step - accuracy: 0.8057 - loss: 0.9264 - val_accuracy: 0.7833 - val_loss: 0.4598 Epoch 127/150 219/219 - 2s - 8ms/step - accuracy: 0.8067 - loss: 0.9130 - val_accuracy: 0.7872 - val_loss: 0.4412 Epoch 128/150 219/219 - 1s - 5ms/step - accuracy: 0.8069 - loss: 0.9084 - val_accuracy: 0.7878 - val_loss: 0.4516 Epoch 129/150 219/219 - 1s - 5ms/step - accuracy: 0.7947 - loss: 0.9280 - val_accuracy: 0.7878 - val_loss: 0.4502 Epoch 130/150 219/219 - 1s - 4ms/step - accuracy: 0.8044 - loss: 0.9337 - val_accuracy: 0.7928 - val_loss: 0.4539 Epoch 131/150 219/219 - 1s - 3ms/step - accuracy: 0.7974 - loss: 0.9295 - val_accuracy: 0.7806 - val_loss: 0.4638 Epoch 132/150 219/219 - 1s - 3ms/step - accuracy: 0.8084 - loss: 0.9192 - val_accuracy: 0.8000 - val_loss: 0.4381 Epoch 133/150 219/219 - 1s - 6ms/step - accuracy: 0.8104 - loss: 0.9144 - val_accuracy: 0.7867 - val_loss: 0.4563 Epoch 134/150 219/219 - 1s - 3ms/step - accuracy: 0.8027 - loss: 0.9281 - val_accuracy: 0.7883 - val_loss: 0.4570 Epoch 135/150 219/219 - 1s - 3ms/step - accuracy: 0.8079 - loss: 0.9042 - val_accuracy: 0.7972 - val_loss: 0.4438 Epoch 136/150 219/219 - 1s - 3ms/step - accuracy: 0.8063 - loss: 0.9082 - val_accuracy: 0.8017 - val_loss: 0.4400 Epoch 137/150 219/219 - 1s - 3ms/step - accuracy: 0.8056 - loss: 0.9089 - val_accuracy: 0.7944 - val_loss: 0.4442 Epoch 138/150 219/219 - 1s - 3ms/step - accuracy: 0.8059 - loss: 0.9135 - val_accuracy: 0.8067 - val_loss: 0.4241 Epoch 139/150 219/219 - 1s - 3ms/step - accuracy: 0.8059 - loss: 0.9121 - val_accuracy: 0.7972 - val_loss: 0.4377 Epoch 140/150 219/219 - 1s - 3ms/step - accuracy: 0.8150 - loss: 0.9007 - val_accuracy: 0.7878 - val_loss: 0.4503 Epoch 141/150 219/219 - 1s - 6ms/step - accuracy: 0.8079 - loss: 0.8999 - val_accuracy: 0.7767 - val_loss: 0.4642 Epoch 142/150 219/219 - 1s - 3ms/step - accuracy: 0.8050 - loss: 0.9178 - val_accuracy: 0.7972 - val_loss: 0.4332 Epoch 143/150 219/219 - 1s - 7ms/step - accuracy: 0.8034 - loss: 0.9113 - val_accuracy: 0.7856 - val_loss: 0.4498 Epoch 144/150 219/219 - 1s - 7ms/step - accuracy: 0.8051 - loss: 0.9059 - val_accuracy: 0.7894 - val_loss: 0.4518 Epoch 145/150 219/219 - 1s - 3ms/step - accuracy: 0.8004 - loss: 0.9102 - val_accuracy: 0.7989 - val_loss: 0.4290 Epoch 146/150 219/219 - 1s - 6ms/step - accuracy: 0.8077 - loss: 0.9091 - val_accuracy: 0.7911 - val_loss: 0.4488 Epoch 147/150 219/219 - 1s - 6ms/step - accuracy: 0.8081 - loss: 0.8956 - val_accuracy: 0.7894 - val_loss: 0.4518 Epoch 148/150 219/219 - 1s - 3ms/step - accuracy: 0.7994 - loss: 0.9083 - val_accuracy: 0.7961 - val_loss: 0.4396 Epoch 149/150 219/219 - 2s - 8ms/step - accuracy: 0.8100 - loss: 0.9098 - val_accuracy: 0.7833 - val_loss: 0.4541 Epoch 150/150 219/219 - 1s - 4ms/step - accuracy: 0.8063 - loss: 0.9059 - val_accuracy: 0.7906 - val_loss: 0.4502
#plot accuracy
plot(history14,'accuracy')
#plot loss
plot(history14,'loss')
#compararing all models
results.loc[14] = [2,[64,32],['relu','tanh'],150,32,"adam",['-', "-"],'xavier',["dropout:0.5","dropout:0.5"],history14.history["loss"][-1],history14.history["val_loss"][-1],history14.history["accuracy"][-1],history14.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
| 10 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2] | 0.892885 | 0.456302 | 0.804857 | 0.776111 | -7129.34 |
| 11 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2, BatchNormalization] | 0.808190 | 0.446884 | 0.810143 | 0.791111 | -7129.34 |
| 12 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | - | 0.571309 | 0.524371 | 0.866714 | 0.786111 | -7129.34 |
| 14 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [dropout:0.5, dropout:0.5] | 0.905916 | 0.450181 | 0.806286 | 0.790556 | -7129.34 |
#Observing model performance for training data
model_performance_classification(model14, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.816571 | 0.816571 | 0.858584 | 0.828844 |
#model performance for validation data
model_performance_classification(model14, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.790556 | 0.790556 | 0.833436 | 0.803812 |
Observations
- Adam with no dropout gives better performance than with dropout.
Adam with learning rate
#clear keras session
tf.keras.backend.clear_session()
#define model
model15 = Sequential()
model15.add(Dense(64,activation="tanh",input_dim=X_train.shape[1]))
model15.add(Dropout(0.2))
model15.add(Dense(32,activation="relu"))
model15.add(Dropout(0.1))
model15.add(Dense(1,activation="sigmoid"))
#summary model
model15.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dropout (Dropout) │ (None, 64) │ 0 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dropout_1 (Dropout) │ (None, 32) │ 0 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#compile model with learning rate
optimizer = tf.keras.optimizers.Adam(learning_rate=0.01)
model15.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#start training
start = time.time()
history15 = model15.fit(X_train, y_train, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 64)
Epoch 1/150 110/110 - 3s - 24ms/step - accuracy: 0.6936 - loss: 1.1894 - val_accuracy: 0.7428 - val_loss: 0.5318 Epoch 2/150 110/110 - 1s - 9ms/step - accuracy: 0.7209 - loss: 1.1238 - val_accuracy: 0.7606 - val_loss: 0.4887 Epoch 3/150 110/110 - 1s - 6ms/step - accuracy: 0.7449 - loss: 1.0659 - val_accuracy: 0.7483 - val_loss: 0.5308 Epoch 4/150 110/110 - 1s - 5ms/step - accuracy: 0.7647 - loss: 1.0302 - val_accuracy: 0.7767 - val_loss: 0.4745 Epoch 5/150 110/110 - 1s - 6ms/step - accuracy: 0.7651 - loss: 1.0248 - val_accuracy: 0.8089 - val_loss: 0.4338 Epoch 6/150 110/110 - 1s - 5ms/step - accuracy: 0.7839 - loss: 0.9963 - val_accuracy: 0.7956 - val_loss: 0.4384 Epoch 7/150 110/110 - 0s - 3ms/step - accuracy: 0.7893 - loss: 0.9663 - val_accuracy: 0.7550 - val_loss: 0.4948 Epoch 8/150 110/110 - 1s - 5ms/step - accuracy: 0.7864 - loss: 0.9750 - val_accuracy: 0.7856 - val_loss: 0.4719 Epoch 9/150 110/110 - 1s - 6ms/step - accuracy: 0.7876 - loss: 0.9672 - val_accuracy: 0.7778 - val_loss: 0.4860 Epoch 10/150 110/110 - 1s - 6ms/step - accuracy: 0.7909 - loss: 0.9665 - val_accuracy: 0.7822 - val_loss: 0.4777 Epoch 11/150 110/110 - 1s - 5ms/step - accuracy: 0.7961 - loss: 0.9755 - val_accuracy: 0.7817 - val_loss: 0.4741 Epoch 12/150 110/110 - 1s - 6ms/step - accuracy: 0.7923 - loss: 0.9571 - val_accuracy: 0.8278 - val_loss: 0.4299 Epoch 13/150 110/110 - 1s - 5ms/step - accuracy: 0.7914 - loss: 0.9682 - val_accuracy: 0.8056 - val_loss: 0.4401 Epoch 14/150 110/110 - 1s - 6ms/step - accuracy: 0.7929 - loss: 0.9699 - val_accuracy: 0.8183 - val_loss: 0.4302 Epoch 15/150 110/110 - 1s - 6ms/step - accuracy: 0.7953 - loss: 0.9683 - val_accuracy: 0.8261 - val_loss: 0.4266 Epoch 16/150 110/110 - 1s - 6ms/step - accuracy: 0.8014 - loss: 0.9475 - val_accuracy: 0.7989 - val_loss: 0.4386 Epoch 17/150 110/110 - 1s - 8ms/step - accuracy: 0.7931 - loss: 0.9470 - val_accuracy: 0.7839 - val_loss: 0.4826 Epoch 18/150 110/110 - 1s - 6ms/step - accuracy: 0.7996 - loss: 0.9515 - val_accuracy: 0.8150 - val_loss: 0.4225 Epoch 19/150 110/110 - 1s - 5ms/step - accuracy: 0.7991 - loss: 0.9498 - val_accuracy: 0.8067 - val_loss: 0.4389 Epoch 20/150 110/110 - 1s - 6ms/step - accuracy: 0.7973 - loss: 0.9457 - val_accuracy: 0.7611 - val_loss: 0.5122 Epoch 21/150 110/110 - 1s - 6ms/step - accuracy: 0.7980 - loss: 0.9386 - val_accuracy: 0.7900 - val_loss: 0.4674 Epoch 22/150 110/110 - 0s - 4ms/step - accuracy: 0.8010 - loss: 0.9444 - val_accuracy: 0.7956 - val_loss: 0.4430 Epoch 23/150 110/110 - 1s - 5ms/step - accuracy: 0.7901 - loss: 0.9487 - val_accuracy: 0.8072 - val_loss: 0.4338 Epoch 24/150 110/110 - 1s - 6ms/step - accuracy: 0.8016 - loss: 0.9304 - val_accuracy: 0.7833 - val_loss: 0.4676 Epoch 25/150 110/110 - 1s - 6ms/step - accuracy: 0.7977 - loss: 0.9485 - val_accuracy: 0.7939 - val_loss: 0.4472 Epoch 26/150 110/110 - 1s - 6ms/step - accuracy: 0.7959 - loss: 0.9320 - val_accuracy: 0.7944 - val_loss: 0.4376 Epoch 27/150 110/110 - 1s - 6ms/step - accuracy: 0.8000 - loss: 0.9361 - val_accuracy: 0.8056 - val_loss: 0.4475 Epoch 28/150 110/110 - 1s - 6ms/step - accuracy: 0.8017 - loss: 0.9411 - val_accuracy: 0.7967 - val_loss: 0.4520 Epoch 29/150 110/110 - 1s - 6ms/step - accuracy: 0.8033 - loss: 0.9347 - val_accuracy: 0.7967 - val_loss: 0.4693 Epoch 30/150 110/110 - 1s - 6ms/step - accuracy: 0.7941 - loss: 0.9547 - val_accuracy: 0.8067 - val_loss: 0.4550 Epoch 31/150 110/110 - 0s - 3ms/step - accuracy: 0.7970 - loss: 0.9422 - val_accuracy: 0.7961 - val_loss: 0.4343 Epoch 32/150 110/110 - 1s - 6ms/step - accuracy: 0.8037 - loss: 0.9273 - val_accuracy: 0.7633 - val_loss: 0.5033 Epoch 33/150 110/110 - 0s - 4ms/step - accuracy: 0.7989 - loss: 0.9282 - val_accuracy: 0.8094 - val_loss: 0.4198 Epoch 34/150 110/110 - 1s - 5ms/step - accuracy: 0.7971 - loss: 0.9342 - val_accuracy: 0.7933 - val_loss: 0.4873 Epoch 35/150 110/110 - 0s - 3ms/step - accuracy: 0.8049 - loss: 0.9377 - val_accuracy: 0.8072 - val_loss: 0.4287 Epoch 36/150 110/110 - 1s - 6ms/step - accuracy: 0.8023 - loss: 0.9367 - val_accuracy: 0.8100 - val_loss: 0.4172 Epoch 37/150 110/110 - 1s - 6ms/step - accuracy: 0.8017 - loss: 0.9282 - val_accuracy: 0.7606 - val_loss: 0.5107 Epoch 38/150 110/110 - 1s - 5ms/step - accuracy: 0.8026 - loss: 0.9290 - val_accuracy: 0.8100 - val_loss: 0.4339 Epoch 39/150 110/110 - 1s - 5ms/step - accuracy: 0.8051 - loss: 0.9208 - val_accuracy: 0.7828 - val_loss: 0.4601 Epoch 40/150 110/110 - 1s - 7ms/step - accuracy: 0.8049 - loss: 0.9285 - val_accuracy: 0.8133 - val_loss: 0.4067 Epoch 41/150 110/110 - 1s - 6ms/step - accuracy: 0.8053 - loss: 0.9135 - val_accuracy: 0.7911 - val_loss: 0.4656 Epoch 42/150 110/110 - 1s - 6ms/step - accuracy: 0.8034 - loss: 0.9180 - val_accuracy: 0.7939 - val_loss: 0.4358 Epoch 43/150 110/110 - 1s - 6ms/step - accuracy: 0.8056 - loss: 0.9138 - val_accuracy: 0.7822 - val_loss: 0.4736 Epoch 44/150 110/110 - 1s - 9ms/step - accuracy: 0.8097 - loss: 0.9113 - val_accuracy: 0.8100 - val_loss: 0.4201 Epoch 45/150 110/110 - 1s - 6ms/step - accuracy: 0.7999 - loss: 0.9245 - val_accuracy: 0.8050 - val_loss: 0.4173 Epoch 46/150 110/110 - 1s - 5ms/step - accuracy: 0.8036 - loss: 0.9239 - val_accuracy: 0.7994 - val_loss: 0.4298 Epoch 47/150 110/110 - 1s - 6ms/step - accuracy: 0.8049 - loss: 0.9207 - val_accuracy: 0.7800 - val_loss: 0.4643 Epoch 48/150 110/110 - 1s - 5ms/step - accuracy: 0.8080 - loss: 0.9096 - val_accuracy: 0.8050 - val_loss: 0.4453 Epoch 49/150 110/110 - 0s - 4ms/step - accuracy: 0.8104 - loss: 0.9055 - val_accuracy: 0.8139 - val_loss: 0.4086 Epoch 50/150 110/110 - 1s - 5ms/step - accuracy: 0.8043 - loss: 0.8974 - val_accuracy: 0.8306 - val_loss: 0.3765 Epoch 51/150 110/110 - 0s - 3ms/step - accuracy: 0.8104 - loss: 0.9063 - val_accuracy: 0.8039 - val_loss: 0.4182 Epoch 52/150 110/110 - 0s - 3ms/step - accuracy: 0.8090 - loss: 0.9053 - val_accuracy: 0.7950 - val_loss: 0.4567 Epoch 53/150 110/110 - 1s - 6ms/step - accuracy: 0.8051 - loss: 0.9074 - val_accuracy: 0.8106 - val_loss: 0.4605 Epoch 54/150 110/110 - 0s - 3ms/step - accuracy: 0.8140 - loss: 0.8814 - val_accuracy: 0.8050 - val_loss: 0.4637 Epoch 55/150 110/110 - 1s - 6ms/step - accuracy: 0.8096 - loss: 0.9183 - val_accuracy: 0.8078 - val_loss: 0.4527 Epoch 56/150 110/110 - 0s - 4ms/step - accuracy: 0.8156 - loss: 0.8913 - val_accuracy: 0.8161 - val_loss: 0.4214 Epoch 57/150 110/110 - 1s - 6ms/step - accuracy: 0.8093 - loss: 0.8993 - val_accuracy: 0.7994 - val_loss: 0.4404 Epoch 58/150 110/110 - 1s - 5ms/step - accuracy: 0.8069 - loss: 0.8945 - val_accuracy: 0.7822 - val_loss: 0.4883 Epoch 59/150 110/110 - 1s - 6ms/step - accuracy: 0.8199 - loss: 0.8795 - val_accuracy: 0.7672 - val_loss: 0.4995 Epoch 60/150 110/110 - 0s - 3ms/step - accuracy: 0.8113 - loss: 0.8941 - val_accuracy: 0.7889 - val_loss: 0.4620 Epoch 61/150 110/110 - 0s - 4ms/step - accuracy: 0.8110 - loss: 0.8839 - val_accuracy: 0.7828 - val_loss: 0.4575 Epoch 62/150 110/110 - 1s - 5ms/step - accuracy: 0.8060 - loss: 0.8847 - val_accuracy: 0.7989 - val_loss: 0.4265 Epoch 63/150 110/110 - 1s - 8ms/step - accuracy: 0.8100 - loss: 0.8916 - val_accuracy: 0.7661 - val_loss: 0.5047 Epoch 64/150 110/110 - 1s - 6ms/step - accuracy: 0.8143 - loss: 0.8828 - val_accuracy: 0.8256 - val_loss: 0.4235 Epoch 65/150 110/110 - 1s - 5ms/step - accuracy: 0.8160 - loss: 0.8853 - val_accuracy: 0.7972 - val_loss: 0.4436 Epoch 66/150 110/110 - 1s - 6ms/step - accuracy: 0.8167 - loss: 0.8727 - val_accuracy: 0.7872 - val_loss: 0.4783 Epoch 67/150 110/110 - 1s - 9ms/step - accuracy: 0.8080 - loss: 0.8748 - val_accuracy: 0.7994 - val_loss: 0.4477 Epoch 68/150 110/110 - 1s - 6ms/step - accuracy: 0.8101 - loss: 0.8811 - val_accuracy: 0.7944 - val_loss: 0.4689 Epoch 69/150 110/110 - 1s - 6ms/step - accuracy: 0.8103 - loss: 0.8784 - val_accuracy: 0.7950 - val_loss: 0.4501 Epoch 70/150 110/110 - 0s - 3ms/step - accuracy: 0.8183 - loss: 0.8695 - val_accuracy: 0.7694 - val_loss: 0.5016 Epoch 71/150 110/110 - 1s - 6ms/step - accuracy: 0.8111 - loss: 0.8760 - val_accuracy: 0.7894 - val_loss: 0.4601 Epoch 72/150 110/110 - 0s - 3ms/step - accuracy: 0.8166 - loss: 0.8731 - val_accuracy: 0.8017 - val_loss: 0.4462 Epoch 73/150 110/110 - 1s - 5ms/step - accuracy: 0.8166 - loss: 0.8693 - val_accuracy: 0.8039 - val_loss: 0.4260 Epoch 74/150 110/110 - 1s - 6ms/step - accuracy: 0.8157 - loss: 0.8710 - val_accuracy: 0.7961 - val_loss: 0.4471 Epoch 75/150 110/110 - 0s - 3ms/step - accuracy: 0.8140 - loss: 0.8920 - val_accuracy: 0.8072 - val_loss: 0.4277 Epoch 76/150 110/110 - 1s - 6ms/step - accuracy: 0.8147 - loss: 0.8738 - val_accuracy: 0.7900 - val_loss: 0.4594 Epoch 77/150 110/110 - 0s - 3ms/step - accuracy: 0.8126 - loss: 0.8785 - val_accuracy: 0.8356 - val_loss: 0.4028 Epoch 78/150 110/110 - 1s - 6ms/step - accuracy: 0.8196 - loss: 0.8682 - val_accuracy: 0.7933 - val_loss: 0.4619 Epoch 79/150 110/110 - 1s - 5ms/step - accuracy: 0.8201 - loss: 0.8790 - val_accuracy: 0.8144 - val_loss: 0.4081 Epoch 80/150 110/110 - 0s - 3ms/step - accuracy: 0.8147 - loss: 0.8764 - val_accuracy: 0.8178 - val_loss: 0.4100 Epoch 81/150 110/110 - 0s - 4ms/step - accuracy: 0.8149 - loss: 0.8720 - val_accuracy: 0.8094 - val_loss: 0.4140 Epoch 82/150 110/110 - 1s - 6ms/step - accuracy: 0.8131 - loss: 0.8580 - val_accuracy: 0.8050 - val_loss: 0.4123 Epoch 83/150 110/110 - 1s - 5ms/step - accuracy: 0.8171 - loss: 0.8695 - val_accuracy: 0.8144 - val_loss: 0.4184 Epoch 84/150 110/110 - 1s - 6ms/step - accuracy: 0.8234 - loss: 0.8482 - val_accuracy: 0.8022 - val_loss: 0.4273 Epoch 85/150 110/110 - 1s - 7ms/step - accuracy: 0.8130 - loss: 0.8744 - val_accuracy: 0.8300 - val_loss: 0.4023 Epoch 86/150 110/110 - 1s - 6ms/step - accuracy: 0.8171 - loss: 0.8710 - val_accuracy: 0.8161 - val_loss: 0.4185 Epoch 87/150 110/110 - 1s - 6ms/step - accuracy: 0.8203 - loss: 0.8507 - val_accuracy: 0.8211 - val_loss: 0.4211 Epoch 88/150 110/110 - 1s - 5ms/step - accuracy: 0.8244 - loss: 0.8586 - val_accuracy: 0.8039 - val_loss: 0.4393 Epoch 89/150 110/110 - 1s - 6ms/step - accuracy: 0.8226 - loss: 0.8622 - val_accuracy: 0.7878 - val_loss: 0.4625 Epoch 90/150 110/110 - 1s - 9ms/step - accuracy: 0.8194 - loss: 0.8468 - val_accuracy: 0.8011 - val_loss: 0.4568 Epoch 91/150 110/110 - 1s - 5ms/step - accuracy: 0.8247 - loss: 0.8396 - val_accuracy: 0.8022 - val_loss: 0.4569 Epoch 92/150 110/110 - 0s - 4ms/step - accuracy: 0.8256 - loss: 0.8497 - val_accuracy: 0.8128 - val_loss: 0.4209 Epoch 93/150 110/110 - 1s - 5ms/step - accuracy: 0.8246 - loss: 0.8551 - val_accuracy: 0.8078 - val_loss: 0.4382 Epoch 94/150 110/110 - 0s - 3ms/step - accuracy: 0.8233 - loss: 0.8567 - val_accuracy: 0.7833 - val_loss: 0.4723 Epoch 95/150 110/110 - 1s - 5ms/step - accuracy: 0.8194 - loss: 0.8460 - val_accuracy: 0.8167 - val_loss: 0.4190 Epoch 96/150 110/110 - 0s - 3ms/step - accuracy: 0.8211 - loss: 0.8456 - val_accuracy: 0.8150 - val_loss: 0.4376 Epoch 97/150 110/110 - 1s - 5ms/step - accuracy: 0.8230 - loss: 0.8607 - val_accuracy: 0.7972 - val_loss: 0.4466 Epoch 98/150 110/110 - 1s - 6ms/step - accuracy: 0.8210 - loss: 0.8694 - val_accuracy: 0.7906 - val_loss: 0.5006 Epoch 99/150 110/110 - 1s - 5ms/step - accuracy: 0.8229 - loss: 0.8489 - val_accuracy: 0.7994 - val_loss: 0.4549 Epoch 100/150 110/110 - 1s - 7ms/step - accuracy: 0.8250 - loss: 0.8472 - val_accuracy: 0.7961 - val_loss: 0.4570 Epoch 101/150 110/110 - 1s - 6ms/step - accuracy: 0.8144 - loss: 0.8572 - val_accuracy: 0.8011 - val_loss: 0.4436 Epoch 102/150 110/110 - 0s - 3ms/step - accuracy: 0.8309 - loss: 0.8362 - val_accuracy: 0.7872 - val_loss: 0.4799 Epoch 103/150 110/110 - 1s - 7ms/step - accuracy: 0.8270 - loss: 0.8428 - val_accuracy: 0.8061 - val_loss: 0.4429 Epoch 104/150 110/110 - 1s - 5ms/step - accuracy: 0.8237 - loss: 0.8662 - val_accuracy: 0.7922 - val_loss: 0.4673 Epoch 105/150 110/110 - 0s - 4ms/step - accuracy: 0.8236 - loss: 0.8285 - val_accuracy: 0.8011 - val_loss: 0.4492 Epoch 106/150 110/110 - 1s - 5ms/step - accuracy: 0.8234 - loss: 0.8276 - val_accuracy: 0.8017 - val_loss: 0.4290 Epoch 107/150 110/110 - 0s - 3ms/step - accuracy: 0.8240 - loss: 0.8388 - val_accuracy: 0.8033 - val_loss: 0.4443 Epoch 108/150 110/110 - 0s - 4ms/step - accuracy: 0.8221 - loss: 0.8551 - val_accuracy: 0.8172 - val_loss: 0.4314 Epoch 109/150 110/110 - 1s - 7ms/step - accuracy: 0.8297 - loss: 0.8482 - val_accuracy: 0.8028 - val_loss: 0.4442 Epoch 110/150 110/110 - 1s - 6ms/step - accuracy: 0.8274 - loss: 0.8206 - val_accuracy: 0.8078 - val_loss: 0.4317 Epoch 111/150 110/110 - 1s - 6ms/step - accuracy: 0.8230 - loss: 0.8367 - val_accuracy: 0.8200 - val_loss: 0.4284 Epoch 112/150 110/110 - 1s - 7ms/step - accuracy: 0.8269 - loss: 0.8358 - val_accuracy: 0.8011 - val_loss: 0.4618 Epoch 113/150 110/110 - 1s - 9ms/step - accuracy: 0.8284 - loss: 0.8206 - val_accuracy: 0.7744 - val_loss: 0.5125 Epoch 114/150 110/110 - 0s - 4ms/step - accuracy: 0.8314 - loss: 0.8299 - val_accuracy: 0.8044 - val_loss: 0.4456 Epoch 115/150 110/110 - 1s - 5ms/step - accuracy: 0.8240 - loss: 0.8322 - val_accuracy: 0.7828 - val_loss: 0.5025 Epoch 116/150 110/110 - 1s - 5ms/step - accuracy: 0.8227 - loss: 0.8245 - val_accuracy: 0.7950 - val_loss: 0.4445 Epoch 117/150 110/110 - 0s - 3ms/step - accuracy: 0.8276 - loss: 0.8181 - val_accuracy: 0.8050 - val_loss: 0.4391 Epoch 118/150 110/110 - 1s - 5ms/step - accuracy: 0.8283 - loss: 0.8090 - val_accuracy: 0.8128 - val_loss: 0.4185 Epoch 119/150 110/110 - 1s - 8ms/step - accuracy: 0.8311 - loss: 0.8177 - val_accuracy: 0.8056 - val_loss: 0.4447 Epoch 120/150 110/110 - 1s - 9ms/step - accuracy: 0.8211 - loss: 0.8429 - val_accuracy: 0.8072 - val_loss: 0.4493 Epoch 121/150 110/110 - 0s - 3ms/step - accuracy: 0.8256 - loss: 0.8184 - val_accuracy: 0.7978 - val_loss: 0.4530 Epoch 122/150 110/110 - 0s - 4ms/step - accuracy: 0.8274 - loss: 0.8193 - val_accuracy: 0.8094 - val_loss: 0.4372 Epoch 123/150 110/110 - 0s - 3ms/step - accuracy: 0.8251 - loss: 0.8355 - val_accuracy: 0.8022 - val_loss: 0.4481 Epoch 124/150 110/110 - 1s - 6ms/step - accuracy: 0.8294 - loss: 0.8073 - val_accuracy: 0.8094 - val_loss: 0.4559 Epoch 125/150 110/110 - 1s - 6ms/step - accuracy: 0.8327 - loss: 0.8053 - val_accuracy: 0.8067 - val_loss: 0.4355 Epoch 126/150 110/110 - 1s - 5ms/step - accuracy: 0.8296 - loss: 0.8305 - val_accuracy: 0.8028 - val_loss: 0.4543 Epoch 127/150 110/110 - 1s - 6ms/step - accuracy: 0.8246 - loss: 0.8185 - val_accuracy: 0.8067 - val_loss: 0.4409 Epoch 128/150 110/110 - 1s - 6ms/step - accuracy: 0.8280 - loss: 0.8024 - val_accuracy: 0.8094 - val_loss: 0.4386 Epoch 129/150 110/110 - 1s - 5ms/step - accuracy: 0.8331 - loss: 0.8283 - val_accuracy: 0.8078 - val_loss: 0.4698 Epoch 130/150 110/110 - 1s - 8ms/step - accuracy: 0.8294 - loss: 0.8026 - val_accuracy: 0.7922 - val_loss: 0.4735 Epoch 131/150 110/110 - 1s - 5ms/step - accuracy: 0.8241 - loss: 0.8340 - val_accuracy: 0.8139 - val_loss: 0.4194 Epoch 132/150 110/110 - 1s - 6ms/step - accuracy: 0.8250 - loss: 0.8103 - val_accuracy: 0.8128 - val_loss: 0.4325 Epoch 133/150 110/110 - 1s - 6ms/step - accuracy: 0.8333 - loss: 0.8022 - val_accuracy: 0.7728 - val_loss: 0.5162 Epoch 134/150 110/110 - 1s - 6ms/step - accuracy: 0.8223 - loss: 0.8306 - val_accuracy: 0.8072 - val_loss: 0.4513 Epoch 135/150 110/110 - 0s - 4ms/step - accuracy: 0.8397 - loss: 0.8067 - val_accuracy: 0.8028 - val_loss: 0.4568 Epoch 136/150 110/110 - 1s - 6ms/step - accuracy: 0.8247 - loss: 0.8104 - val_accuracy: 0.8233 - val_loss: 0.4335 Epoch 137/150 110/110 - 1s - 6ms/step - accuracy: 0.8300 - loss: 0.8169 - val_accuracy: 0.8006 - val_loss: 0.4666 Epoch 138/150 110/110 - 1s - 5ms/step - accuracy: 0.8293 - loss: 0.7989 - val_accuracy: 0.8078 - val_loss: 0.4368 Epoch 139/150 110/110 - 1s - 5ms/step - accuracy: 0.8266 - loss: 0.8185 - val_accuracy: 0.7967 - val_loss: 0.4718 Epoch 140/150 110/110 - 1s - 6ms/step - accuracy: 0.8287 - loss: 0.8004 - val_accuracy: 0.8044 - val_loss: 0.4625 Epoch 141/150 110/110 - 1s - 6ms/step - accuracy: 0.8229 - loss: 0.8118 - val_accuracy: 0.8289 - val_loss: 0.4078 Epoch 142/150 110/110 - 0s - 3ms/step - accuracy: 0.8274 - loss: 0.8076 - val_accuracy: 0.7983 - val_loss: 0.4706 Epoch 143/150 110/110 - 0s - 3ms/step - accuracy: 0.8311 - loss: 0.8108 - val_accuracy: 0.8033 - val_loss: 0.4529 Epoch 144/150 110/110 - 0s - 4ms/step - accuracy: 0.8256 - loss: 0.8003 - val_accuracy: 0.7983 - val_loss: 0.4840 Epoch 145/150 110/110 - 1s - 6ms/step - accuracy: 0.8307 - loss: 0.7905 - val_accuracy: 0.8311 - val_loss: 0.4094 Epoch 146/150 110/110 - 0s - 4ms/step - accuracy: 0.8314 - loss: 0.8006 - val_accuracy: 0.8222 - val_loss: 0.4264 Epoch 147/150 110/110 - 0s - 3ms/step - accuracy: 0.8350 - loss: 0.7981 - val_accuracy: 0.7989 - val_loss: 0.4645 Epoch 148/150 110/110 - 0s - 4ms/step - accuracy: 0.8323 - loss: 0.8125 - val_accuracy: 0.8183 - val_loss: 0.4448 Epoch 149/150 110/110 - 1s - 5ms/step - accuracy: 0.8284 - loss: 0.8055 - val_accuracy: 0.8083 - val_loss: 0.4488 Epoch 150/150 110/110 - 1s - 6ms/step - accuracy: 0.8317 - loss: 0.8123 - val_accuracy: 0.8039 - val_loss: 0.4747
#Plot Accuracy curve
plot(history15,'accuracy')
#PLot loss curve
plot(history15,'loss')
#Comparing model with other models
results.loc[15] = [2,[64,32],['tanh','relu'],150,64,"adam",[0.01,'-'],'xavier',["dropout-0.2","dropout-0.1"],history15.history["loss"][-1],history15.history["val_loss"][-1],history15.history["accuracy"][-1],history15.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
| 10 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2] | 0.892885 | 0.456302 | 0.804857 | 0.776111 | -7129.34 |
| 11 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2, BatchNormalization] | 0.808190 | 0.446884 | 0.810143 | 0.791111 | -7129.34 |
| 12 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | - | 0.571309 | 0.524371 | 0.866714 | 0.786111 | -7129.34 |
| 14 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [dropout:0.5, dropout:0.5] | 0.905916 | 0.450181 | 0.806286 | 0.790556 | -7129.34 |
| 15 | 2 | [64, 32] | [tanh, relu] | 150 | 64 | adam | [0.01, -] | xavier | [dropout-0.2, dropout-0.1] | 0.812251 | 0.474700 | 0.831714 | 0.803889 | -7129.34 |
#model performance for training datset
model_performance_classification(model15, X_train, y_train)
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.853714 | 0.853714 | 0.882311 | 0.861997 |
#model performance with validation dataset
model_performance_classification(model15, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.803889 | 0.803889 | 0.835261 | 0.814333 |
Observations
- With this model, model accuracy and recall improved for training but not for validatation.Still model with adam without dropout works better.
Neural Network with Balanced Data (by applying SMOTE) and Adam Optimizer¶
Define SMOTE for balanced data
#Define SMOTE for balanced data
# Synthetic Minority Over Sampling Technique
from imblearn.over_sampling import SMOTE
sm = SMOTE(sampling_strategy=1, k_neighbors=5, random_state=1)
X_train_over, y_train_over = sm.fit_resample(X_train, y_train)
X_val_over, y_val_over = sm.fit_resample(X_val, y_val)
X_test_over, y_test_over = sm.fit_resample(X_test, y_test)
print("Before OverSampling, count of label '1': {}".format(sum(y_train['Exited'] == 1)))
print("Before OverSampling, count of label '0': {} \n".format(sum(y_train['Exited'] == 0)))
print("After OverSampling, count of label '1': {}".format(sum(y_train_over['Exited'] == 1)))
print("After OverSampling, count of label '0': {} \n".format(sum(y_train_over['Exited'] == 0)))
print("After OverSampling, the shape of train_X: {}".format(X_train_over.shape))
print("After OverSampling, the shape of train_y: {} \n".format(y_train_over.shape))
Before OverSampling, count of label '1': 1410 Before OverSampling, count of label '0': 5590 After OverSampling, count of label '1': 5590 After OverSampling, count of label '0': 5590 After OverSampling, the shape of train_X: (11180, 11) After OverSampling, the shape of train_y: (11180, 1)
print("Before OverSampling, count of label '1': {}".format(sum(y_val['Exited'] == 1)))
print("Before OverSampling, count of label '0': {} \n".format(sum(y_val['Exited'] == 0)))
print("After OverSampling, count of label '1': {}".format(sum(y_val_over['Exited'] == 1)))
print("After OverSampling, count of label '0': {} \n".format(sum(y_val_over['Exited'] == 0)))
print("After OverSampling, the shape of train_X: {}".format(X_val_over.shape))
print("After OverSampling, the shape of train_y: {} \n".format(y_val_over.shape))
Before OverSampling, count of label '1': 377 Before OverSampling, count of label '0': 1423 After OverSampling, count of label '1': 1423 After OverSampling, count of label '0': 1423 After OverSampling, the shape of train_X: (2846, 11) After OverSampling, the shape of train_y: (2846, 1)
print("Before OverSampling, count of label '1': {}".format(sum(y_test['Exited'] == 1)))
print("Before OverSampling, count of label '0': {} \n".format(sum(y_val['Exited'] == 0)))
print("After OverSampling, count of label '1': {}".format(sum(y_test_over['Exited'] == 1)))
print("After OverSampling, count of label '0': {} \n".format(sum(y_test_over['Exited'] == 0)))
print("After OverSampling, the shape of train_X: {}".format(X_test_over.shape))
print("After OverSampling, the shape of train_y: {} \n".format(y_test_over.shape))
Before OverSampling, count of label '1': 250 Before OverSampling, count of label '0': 1423 After OverSampling, count of label '1': 950 After OverSampling, count of label '0': 950 After OverSampling, the shape of train_X: (1900, 11) After OverSampling, the shape of train_y: (1900, 1)
#clear keras session
tf.keras.backend.clear_session()
#define model
model16 = Sequential()
model16.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model16.add(Dense(32,activation="tanh"))
model16.add(Dense(1,activation="sigmoid"))
#model summary
model16.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#compile model
optimizer = tf.keras.optimizers.Adam() # defining SGD as the optimizer to be used
model16.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
# Compute class weights using compute_class_weight
from sklearn.utils.class_weight import compute_class_weight
# Ensure y_train_over is a 1D array-like and extract unique class labels
y_train_over_values = y_train_over['Exited'].values # Assuming 'Exited' is the target column
unique_classes = np.unique(y_train_over_values)
# Calculate class weights
class_weights = compute_class_weight(class_weight='balanced', classes=unique_classes, y=y_train_over_values)
class_weight_dict = dict(enumerate(class_weights))
#start training
start = time.time()
history16 = model16.fit(X_train_over, y_train_over, validation_data=(X_val_over,y_val_over),epochs=150,class_weight=class_weight_dict, verbose=2,batch_size = 32)
Epoch 1/150 350/350 - 3s - 8ms/step - accuracy: 0.7178 - loss: 0.5519 - val_accuracy: 0.7583 - val_loss: 0.4972 Epoch 2/150 350/350 - 2s - 6ms/step - accuracy: 0.7541 - loss: 0.5021 - val_accuracy: 0.7727 - val_loss: 0.4752 Epoch 3/150 350/350 - 1s - 3ms/step - accuracy: 0.7747 - loss: 0.4741 - val_accuracy: 0.7836 - val_loss: 0.4553 Epoch 4/150 350/350 - 1s - 2ms/step - accuracy: 0.7868 - loss: 0.4544 - val_accuracy: 0.7758 - val_loss: 0.4497 Epoch 5/150 350/350 - 1s - 4ms/step - accuracy: 0.7933 - loss: 0.4411 - val_accuracy: 0.7860 - val_loss: 0.4411 Epoch 6/150 350/350 - 1s - 2ms/step - accuracy: 0.8003 - loss: 0.4301 - val_accuracy: 0.7807 - val_loss: 0.4430 Epoch 7/150 350/350 - 1s - 4ms/step - accuracy: 0.8045 - loss: 0.4238 - val_accuracy: 0.7888 - val_loss: 0.4326 Epoch 8/150 350/350 - 1s - 4ms/step - accuracy: 0.8100 - loss: 0.4154 - val_accuracy: 0.7913 - val_loss: 0.4272 Epoch 9/150 350/350 - 1s - 2ms/step - accuracy: 0.8140 - loss: 0.4096 - val_accuracy: 0.7881 - val_loss: 0.4360 Epoch 10/150 350/350 - 1s - 4ms/step - accuracy: 0.8161 - loss: 0.4060 - val_accuracy: 0.7980 - val_loss: 0.4344 Epoch 11/150 350/350 - 1s - 3ms/step - accuracy: 0.8148 - loss: 0.4004 - val_accuracy: 0.7934 - val_loss: 0.4312 Epoch 12/150 350/350 - 2s - 5ms/step - accuracy: 0.8196 - loss: 0.3990 - val_accuracy: 0.7941 - val_loss: 0.4226 Epoch 13/150 350/350 - 2s - 6ms/step - accuracy: 0.8216 - loss: 0.3952 - val_accuracy: 0.7927 - val_loss: 0.4317 Epoch 14/150 350/350 - 1s - 3ms/step - accuracy: 0.8245 - loss: 0.3905 - val_accuracy: 0.7888 - val_loss: 0.4352 Epoch 15/150 350/350 - 1s - 4ms/step - accuracy: 0.8251 - loss: 0.3882 - val_accuracy: 0.7913 - val_loss: 0.4271 Epoch 16/150 350/350 - 1s - 2ms/step - accuracy: 0.8302 - loss: 0.3842 - val_accuracy: 0.7916 - val_loss: 0.4263 Epoch 17/150 350/350 - 1s - 4ms/step - accuracy: 0.8284 - loss: 0.3813 - val_accuracy: 0.7892 - val_loss: 0.4256 Epoch 18/150 350/350 - 1s - 4ms/step - accuracy: 0.8320 - loss: 0.3796 - val_accuracy: 0.7909 - val_loss: 0.4326 Epoch 19/150 350/350 - 1s - 3ms/step - accuracy: 0.8284 - loss: 0.3769 - val_accuracy: 0.7973 - val_loss: 0.4297 Epoch 20/150 350/350 - 1s - 2ms/step - accuracy: 0.8358 - loss: 0.3726 - val_accuracy: 0.7916 - val_loss: 0.4278 Epoch 21/150 350/350 - 1s - 2ms/step - accuracy: 0.8348 - loss: 0.3709 - val_accuracy: 0.7909 - val_loss: 0.4390 Epoch 22/150 350/350 - 2s - 5ms/step - accuracy: 0.8377 - loss: 0.3701 - val_accuracy: 0.7934 - val_loss: 0.4363 Epoch 23/150 350/350 - 2s - 6ms/step - accuracy: 0.8411 - loss: 0.3661 - val_accuracy: 0.7895 - val_loss: 0.4340 Epoch 24/150 350/350 - 1s - 4ms/step - accuracy: 0.8401 - loss: 0.3666 - val_accuracy: 0.7934 - val_loss: 0.4430 Epoch 25/150 350/350 - 1s - 3ms/step - accuracy: 0.8416 - loss: 0.3614 - val_accuracy: 0.7923 - val_loss: 0.4427 Epoch 26/150 350/350 - 1s - 4ms/step - accuracy: 0.8416 - loss: 0.3607 - val_accuracy: 0.7927 - val_loss: 0.4408 Epoch 27/150 350/350 - 1s - 4ms/step - accuracy: 0.8462 - loss: 0.3558 - val_accuracy: 0.7832 - val_loss: 0.4555 Epoch 28/150 350/350 - 1s - 2ms/step - accuracy: 0.8470 - loss: 0.3541 - val_accuracy: 0.8032 - val_loss: 0.4317 Epoch 29/150 350/350 - 1s - 4ms/step - accuracy: 0.8481 - loss: 0.3509 - val_accuracy: 0.8004 - val_loss: 0.4389 Epoch 30/150 350/350 - 1s - 3ms/step - accuracy: 0.8481 - loss: 0.3497 - val_accuracy: 0.8032 - val_loss: 0.4373 Epoch 31/150 350/350 - 1s - 3ms/step - accuracy: 0.8496 - loss: 0.3484 - val_accuracy: 0.7952 - val_loss: 0.4544 Epoch 32/150 350/350 - 2s - 5ms/step - accuracy: 0.8490 - loss: 0.3465 - val_accuracy: 0.7829 - val_loss: 0.4753 Epoch 33/150 350/350 - 2s - 6ms/step - accuracy: 0.8480 - loss: 0.3468 - val_accuracy: 0.7973 - val_loss: 0.4444 Epoch 34/150 350/350 - 1s - 3ms/step - accuracy: 0.8522 - loss: 0.3412 - val_accuracy: 0.7959 - val_loss: 0.4512 Epoch 35/150 350/350 - 1s - 2ms/step - accuracy: 0.8523 - loss: 0.3405 - val_accuracy: 0.8018 - val_loss: 0.4451 Epoch 36/150 350/350 - 1s - 4ms/step - accuracy: 0.8535 - loss: 0.3411 - val_accuracy: 0.7944 - val_loss: 0.4424 Epoch 37/150 350/350 - 1s - 2ms/step - accuracy: 0.8560 - loss: 0.3348 - val_accuracy: 0.7822 - val_loss: 0.4618 Epoch 38/150 350/350 - 1s - 4ms/step - accuracy: 0.8555 - loss: 0.3357 - val_accuracy: 0.7923 - val_loss: 0.4704 Epoch 39/150 350/350 - 1s - 2ms/step - accuracy: 0.8582 - loss: 0.3323 - val_accuracy: 0.7941 - val_loss: 0.4632 Epoch 40/150 350/350 - 1s - 4ms/step - accuracy: 0.8589 - loss: 0.3306 - val_accuracy: 0.7976 - val_loss: 0.4587 Epoch 41/150 350/350 - 1s - 2ms/step - accuracy: 0.8622 - loss: 0.3270 - val_accuracy: 0.7825 - val_loss: 0.4826 Epoch 42/150 350/350 - 1s - 2ms/step - accuracy: 0.8600 - loss: 0.3255 - val_accuracy: 0.7906 - val_loss: 0.4596 Epoch 43/150 350/350 - 1s - 3ms/step - accuracy: 0.8632 - loss: 0.3251 - val_accuracy: 0.7923 - val_loss: 0.4514 Epoch 44/150 350/350 - 1s - 4ms/step - accuracy: 0.8647 - loss: 0.3226 - val_accuracy: 0.7822 - val_loss: 0.4694 Epoch 45/150 350/350 - 2s - 6ms/step - accuracy: 0.8658 - loss: 0.3199 - val_accuracy: 0.7814 - val_loss: 0.4793 Epoch 46/150 350/350 - 1s - 3ms/step - accuracy: 0.8646 - loss: 0.3214 - val_accuracy: 0.7885 - val_loss: 0.4773 Epoch 47/150 350/350 - 1s - 4ms/step - accuracy: 0.8657 - loss: 0.3171 - val_accuracy: 0.7800 - val_loss: 0.4960 Epoch 48/150 350/350 - 1s - 4ms/step - accuracy: 0.8652 - loss: 0.3150 - val_accuracy: 0.7864 - val_loss: 0.4652 Epoch 49/150 350/350 - 1s - 3ms/step - accuracy: 0.8681 - loss: 0.3154 - val_accuracy: 0.7885 - val_loss: 0.4632 Epoch 50/150 350/350 - 1s - 4ms/step - accuracy: 0.8699 - loss: 0.3123 - val_accuracy: 0.7814 - val_loss: 0.4701 Epoch 51/150 350/350 - 1s - 3ms/step - accuracy: 0.8674 - loss: 0.3092 - val_accuracy: 0.7853 - val_loss: 0.4792 Epoch 52/150 350/350 - 1s - 4ms/step - accuracy: 0.8708 - loss: 0.3109 - val_accuracy: 0.7895 - val_loss: 0.4726 Epoch 53/150 350/350 - 1s - 3ms/step - accuracy: 0.8750 - loss: 0.3040 - val_accuracy: 0.7892 - val_loss: 0.4755 Epoch 54/150 350/350 - 2s - 4ms/step - accuracy: 0.8704 - loss: 0.3078 - val_accuracy: 0.7888 - val_loss: 0.4801 Epoch 55/150 350/350 - 2s - 6ms/step - accuracy: 0.8739 - loss: 0.3022 - val_accuracy: 0.7825 - val_loss: 0.4809 Epoch 56/150 350/350 - 1s - 4ms/step - accuracy: 0.8742 - loss: 0.3030 - val_accuracy: 0.7723 - val_loss: 0.5304 Epoch 57/150 350/350 - 1s - 3ms/step - accuracy: 0.8767 - loss: 0.2982 - val_accuracy: 0.7807 - val_loss: 0.5039 Epoch 58/150 350/350 - 1s - 2ms/step - accuracy: 0.8775 - loss: 0.2987 - val_accuracy: 0.7906 - val_loss: 0.4939 Epoch 59/150 350/350 - 1s - 4ms/step - accuracy: 0.8799 - loss: 0.2941 - val_accuracy: 0.7814 - val_loss: 0.5228 Epoch 60/150 350/350 - 1s - 3ms/step - accuracy: 0.8767 - loss: 0.2931 - val_accuracy: 0.7762 - val_loss: 0.4916 Epoch 61/150 350/350 - 1s - 4ms/step - accuracy: 0.8787 - loss: 0.2903 - val_accuracy: 0.7776 - val_loss: 0.5050 Epoch 62/150 350/350 - 1s - 3ms/step - accuracy: 0.8805 - loss: 0.2896 - val_accuracy: 0.7807 - val_loss: 0.4976 Epoch 63/150 350/350 - 1s - 4ms/step - accuracy: 0.8796 - loss: 0.2888 - val_accuracy: 0.7832 - val_loss: 0.5052 Epoch 64/150 350/350 - 2s - 5ms/step - accuracy: 0.8797 - loss: 0.2883 - val_accuracy: 0.7706 - val_loss: 0.5444 Epoch 65/150 350/350 - 2s - 6ms/step - accuracy: 0.8811 - loss: 0.2860 - val_accuracy: 0.7758 - val_loss: 0.5181 Epoch 66/150 350/350 - 1s - 2ms/step - accuracy: 0.8810 - loss: 0.2824 - val_accuracy: 0.7797 - val_loss: 0.5084 Epoch 67/150 350/350 - 1s - 2ms/step - accuracy: 0.8850 - loss: 0.2794 - val_accuracy: 0.7846 - val_loss: 0.5074 Epoch 68/150 350/350 - 1s - 2ms/step - accuracy: 0.8824 - loss: 0.2818 - val_accuracy: 0.7839 - val_loss: 0.5045 Epoch 69/150 350/350 - 1s - 2ms/step - accuracy: 0.8856 - loss: 0.2770 - val_accuracy: 0.7811 - val_loss: 0.5024 Epoch 70/150 350/350 - 1s - 4ms/step - accuracy: 0.8860 - loss: 0.2786 - val_accuracy: 0.7804 - val_loss: 0.5040 Epoch 71/150 350/350 - 1s - 2ms/step - accuracy: 0.8844 - loss: 0.2780 - val_accuracy: 0.7744 - val_loss: 0.5150 Epoch 72/150 350/350 - 1s - 4ms/step - accuracy: 0.8893 - loss: 0.2749 - val_accuracy: 0.7822 - val_loss: 0.5399 Epoch 73/150 350/350 - 1s - 3ms/step - accuracy: 0.8868 - loss: 0.2722 - val_accuracy: 0.7663 - val_loss: 0.5480 Epoch 74/150 350/350 - 1s - 3ms/step - accuracy: 0.8873 - loss: 0.2717 - val_accuracy: 0.7734 - val_loss: 0.5277 Epoch 75/150 350/350 - 2s - 4ms/step - accuracy: 0.8896 - loss: 0.2710 - val_accuracy: 0.7713 - val_loss: 0.5484 Epoch 76/150 350/350 - 2s - 6ms/step - accuracy: 0.8895 - loss: 0.2696 - val_accuracy: 0.7797 - val_loss: 0.5255 Epoch 77/150 350/350 - 1s - 4ms/step - accuracy: 0.8939 - loss: 0.2673 - val_accuracy: 0.7699 - val_loss: 0.5477 Epoch 78/150 350/350 - 1s - 4ms/step - accuracy: 0.8924 - loss: 0.2647 - val_accuracy: 0.7677 - val_loss: 0.5471 Epoch 79/150 350/350 - 1s - 4ms/step - accuracy: 0.8913 - loss: 0.2660 - val_accuracy: 0.7734 - val_loss: 0.5263 Epoch 80/150 350/350 - 1s - 4ms/step - accuracy: 0.8913 - loss: 0.2633 - val_accuracy: 0.7857 - val_loss: 0.5215 Epoch 81/150 350/350 - 1s - 3ms/step - accuracy: 0.8949 - loss: 0.2606 - val_accuracy: 0.7677 - val_loss: 0.5368 Epoch 82/150 350/350 - 1s - 4ms/step - accuracy: 0.8955 - loss: 0.2604 - val_accuracy: 0.7727 - val_loss: 0.5543 Epoch 83/150 350/350 - 1s - 4ms/step - accuracy: 0.8962 - loss: 0.2565 - val_accuracy: 0.7625 - val_loss: 0.5752 Epoch 84/150 350/350 - 2s - 5ms/step - accuracy: 0.8980 - loss: 0.2556 - val_accuracy: 0.7734 - val_loss: 0.5629 Epoch 85/150 350/350 - 2s - 6ms/step - accuracy: 0.8970 - loss: 0.2559 - val_accuracy: 0.7720 - val_loss: 0.5532 Epoch 86/150 350/350 - 1s - 3ms/step - accuracy: 0.8950 - loss: 0.2561 - val_accuracy: 0.7811 - val_loss: 0.5270 Epoch 87/150 350/350 - 1s - 3ms/step - accuracy: 0.8968 - loss: 0.2533 - val_accuracy: 0.7646 - val_loss: 0.5704 Epoch 88/150 350/350 - 1s - 3ms/step - accuracy: 0.8973 - loss: 0.2532 - val_accuracy: 0.7667 - val_loss: 0.5758 Epoch 89/150 350/350 - 1s - 4ms/step - accuracy: 0.8976 - loss: 0.2503 - val_accuracy: 0.7776 - val_loss: 0.5455 Epoch 90/150 350/350 - 1s - 4ms/step - accuracy: 0.9002 - loss: 0.2487 - val_accuracy: 0.7769 - val_loss: 0.5753 Epoch 91/150 350/350 - 1s - 4ms/step - accuracy: 0.8985 - loss: 0.2497 - val_accuracy: 0.7684 - val_loss: 0.5707 Epoch 92/150 350/350 - 1s - 4ms/step - accuracy: 0.9013 - loss: 0.2470 - val_accuracy: 0.7537 - val_loss: 0.6008 Epoch 93/150 350/350 - 2s - 5ms/step - accuracy: 0.9004 - loss: 0.2475 - val_accuracy: 0.7565 - val_loss: 0.6077 Epoch 94/150 350/350 - 1s - 4ms/step - accuracy: 0.9016 - loss: 0.2463 - val_accuracy: 0.7632 - val_loss: 0.5940 Epoch 95/150 350/350 - 2s - 6ms/step - accuracy: 0.9010 - loss: 0.2441 - val_accuracy: 0.7544 - val_loss: 0.6212 Epoch 96/150 350/350 - 1s - 2ms/step - accuracy: 0.9023 - loss: 0.2414 - val_accuracy: 0.7699 - val_loss: 0.5972 Epoch 97/150 350/350 - 1s - 4ms/step - accuracy: 0.9044 - loss: 0.2411 - val_accuracy: 0.7495 - val_loss: 0.6233 Epoch 98/150 350/350 - 1s - 2ms/step - accuracy: 0.9047 - loss: 0.2401 - val_accuracy: 0.7583 - val_loss: 0.6158 Epoch 99/150 350/350 - 1s - 4ms/step - accuracy: 0.9038 - loss: 0.2369 - val_accuracy: 0.7586 - val_loss: 0.6097 Epoch 100/150 350/350 - 1s - 4ms/step - accuracy: 0.9069 - loss: 0.2370 - val_accuracy: 0.7646 - val_loss: 0.6093 Epoch 101/150 350/350 - 1s - 3ms/step - accuracy: 0.9053 - loss: 0.2358 - val_accuracy: 0.7720 - val_loss: 0.5878 Epoch 102/150 350/350 - 1s - 4ms/step - accuracy: 0.9072 - loss: 0.2317 - val_accuracy: 0.7621 - val_loss: 0.6105 Epoch 103/150 350/350 - 2s - 5ms/step - accuracy: 0.9097 - loss: 0.2323 - val_accuracy: 0.7663 - val_loss: 0.6198 Epoch 104/150 350/350 - 1s - 4ms/step - accuracy: 0.9097 - loss: 0.2318 - val_accuracy: 0.7544 - val_loss: 0.6716 Epoch 105/150 350/350 - 2s - 6ms/step - accuracy: 0.9095 - loss: 0.2297 - val_accuracy: 0.7561 - val_loss: 0.6323 Epoch 106/150 350/350 - 1s - 3ms/step - accuracy: 0.9112 - loss: 0.2289 - val_accuracy: 0.7699 - val_loss: 0.5905 Epoch 107/150 350/350 - 1s - 3ms/step - accuracy: 0.9118 - loss: 0.2269 - val_accuracy: 0.7660 - val_loss: 0.5823 Epoch 108/150 350/350 - 1s - 4ms/step - accuracy: 0.9103 - loss: 0.2261 - val_accuracy: 0.7618 - val_loss: 0.6167 Epoch 109/150 350/350 - 1s - 4ms/step - accuracy: 0.9105 - loss: 0.2270 - val_accuracy: 0.7576 - val_loss: 0.6685 Epoch 110/150 350/350 - 1s - 3ms/step - accuracy: 0.9140 - loss: 0.2231 - val_accuracy: 0.7614 - val_loss: 0.6192 Epoch 111/150 350/350 - 1s - 3ms/step - accuracy: 0.9137 - loss: 0.2216 - val_accuracy: 0.7621 - val_loss: 0.6236 Epoch 112/150 350/350 - 1s - 4ms/step - accuracy: 0.9128 - loss: 0.2223 - val_accuracy: 0.7681 - val_loss: 0.6341 Epoch 113/150 350/350 - 1s - 3ms/step - accuracy: 0.9127 - loss: 0.2188 - val_accuracy: 0.7723 - val_loss: 0.6179 Epoch 114/150 350/350 - 1s - 4ms/step - accuracy: 0.9153 - loss: 0.2179 - val_accuracy: 0.7632 - val_loss: 0.6409 Epoch 115/150 350/350 - 2s - 6ms/step - accuracy: 0.9143 - loss: 0.2172 - val_accuracy: 0.7502 - val_loss: 0.6346 Epoch 116/150 350/350 - 1s - 4ms/step - accuracy: 0.9149 - loss: 0.2172 - val_accuracy: 0.7449 - val_loss: 0.6599 Epoch 117/150 350/350 - 1s - 4ms/step - accuracy: 0.9149 - loss: 0.2189 - val_accuracy: 0.7600 - val_loss: 0.6372 Epoch 118/150 350/350 - 1s - 3ms/step - accuracy: 0.9181 - loss: 0.2141 - val_accuracy: 0.7579 - val_loss: 0.6616 Epoch 119/150 350/350 - 1s - 2ms/step - accuracy: 0.9177 - loss: 0.2144 - val_accuracy: 0.7621 - val_loss: 0.6522 Epoch 120/150 350/350 - 1s - 4ms/step - accuracy: 0.9193 - loss: 0.2113 - val_accuracy: 0.7481 - val_loss: 0.6891 Epoch 121/150 350/350 - 1s - 4ms/step - accuracy: 0.9189 - loss: 0.2105 - val_accuracy: 0.7618 - val_loss: 0.6305 Epoch 122/150 350/350 - 1s - 4ms/step - accuracy: 0.9168 - loss: 0.2099 - val_accuracy: 0.7365 - val_loss: 0.7217 Epoch 123/150 350/350 - 2s - 5ms/step - accuracy: 0.9225 - loss: 0.2071 - val_accuracy: 0.7421 - val_loss: 0.7303 Epoch 124/150 350/350 - 1s - 4ms/step - accuracy: 0.9220 - loss: 0.2074 - val_accuracy: 0.7544 - val_loss: 0.6675 Epoch 125/150 350/350 - 2s - 6ms/step - accuracy: 0.9226 - loss: 0.2043 - val_accuracy: 0.7417 - val_loss: 0.6860 Epoch 126/150 350/350 - 1s - 2ms/step - accuracy: 0.9195 - loss: 0.2074 - val_accuracy: 0.7586 - val_loss: 0.6636 Epoch 127/150 350/350 - 1s - 2ms/step - accuracy: 0.9203 - loss: 0.2067 - val_accuracy: 0.7600 - val_loss: 0.6552 Epoch 128/150 350/350 - 1s - 2ms/step - accuracy: 0.9237 - loss: 0.2027 - val_accuracy: 0.7442 - val_loss: 0.7718 Epoch 129/150 350/350 - 1s - 4ms/step - accuracy: 0.9225 - loss: 0.2033 - val_accuracy: 0.7484 - val_loss: 0.7123 Epoch 130/150 350/350 - 1s - 2ms/step - accuracy: 0.9198 - loss: 0.2011 - val_accuracy: 0.7463 - val_loss: 0.6959 Epoch 131/150 350/350 - 1s - 4ms/step - accuracy: 0.9255 - loss: 0.1998 - val_accuracy: 0.7463 - val_loss: 0.7147 Epoch 132/150 350/350 - 1s - 2ms/step - accuracy: 0.9255 - loss: 0.1999 - val_accuracy: 0.7512 - val_loss: 0.6832 Epoch 133/150 350/350 - 1s - 4ms/step - accuracy: 0.9250 - loss: 0.1982 - val_accuracy: 0.7403 - val_loss: 0.7809 Epoch 134/150 350/350 - 2s - 5ms/step - accuracy: 0.9274 - loss: 0.1967 - val_accuracy: 0.7551 - val_loss: 0.7046 Epoch 135/150 350/350 - 2s - 7ms/step - accuracy: 0.9258 - loss: 0.1964 - val_accuracy: 0.7572 - val_loss: 0.6659 Epoch 136/150 350/350 - 1s - 3ms/step - accuracy: 0.9254 - loss: 0.1946 - val_accuracy: 0.7537 - val_loss: 0.7032 Epoch 137/150 350/350 - 1s - 3ms/step - accuracy: 0.9269 - loss: 0.1956 - val_accuracy: 0.7512 - val_loss: 0.7055 Epoch 138/150 350/350 - 1s - 4ms/step - accuracy: 0.9274 - loss: 0.1901 - val_accuracy: 0.7312 - val_loss: 0.8476 Epoch 139/150 350/350 - 1s - 3ms/step - accuracy: 0.9279 - loss: 0.1920 - val_accuracy: 0.7393 - val_loss: 0.7520 Epoch 140/150 350/350 - 1s - 3ms/step - accuracy: 0.9283 - loss: 0.1889 - val_accuracy: 0.7428 - val_loss: 0.7627 Epoch 141/150 350/350 - 1s - 4ms/step - accuracy: 0.9300 - loss: 0.1893 - val_accuracy: 0.7495 - val_loss: 0.7016 Epoch 142/150 350/350 - 1s - 3ms/step - accuracy: 0.9287 - loss: 0.1890 - val_accuracy: 0.7502 - val_loss: 0.7025 Epoch 143/150 350/350 - 1s - 3ms/step - accuracy: 0.9299 - loss: 0.1877 - val_accuracy: 0.7424 - val_loss: 0.7627 Epoch 144/150 350/350 - 2s - 5ms/step - accuracy: 0.9305 - loss: 0.1880 - val_accuracy: 0.7554 - val_loss: 0.7517 Epoch 145/150 350/350 - 2s - 6ms/step - accuracy: 0.9324 - loss: 0.1848 - val_accuracy: 0.7421 - val_loss: 0.8159 Epoch 146/150 350/350 - 1s - 3ms/step - accuracy: 0.9285 - loss: 0.1841 - val_accuracy: 0.7393 - val_loss: 0.8199 Epoch 147/150 350/350 - 1s - 3ms/step - accuracy: 0.9328 - loss: 0.1822 - val_accuracy: 0.7505 - val_loss: 0.7634 Epoch 148/150 350/350 - 1s - 3ms/step - accuracy: 0.9345 - loss: 0.1829 - val_accuracy: 0.7519 - val_loss: 0.7584 Epoch 149/150 350/350 - 1s - 4ms/step - accuracy: 0.9339 - loss: 0.1803 - val_accuracy: 0.7449 - val_loss: 0.7621 Epoch 150/150 350/350 - 1s - 2ms/step - accuracy: 0.9328 - loss: 0.1811 - val_accuracy: 0.7446 - val_loss: 0.8190
#Observing Accuracy curve
plot(history16,'accuracy')
#observing loss curve
plot(history16,'loss')
#comparing model performance
results.loc[16] = [2,[64,32],['relu','tanh'],150,32,"adam",['-', "-"],"xavier",["SMOTE"],history16.history["loss"][-1],history16.history["val_loss"][-1],history16.history["accuracy"][-1],history16.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
| 10 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2] | 0.892885 | 0.456302 | 0.804857 | 0.776111 | -7129.34 |
| 11 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2, BatchNormalization] | 0.808190 | 0.446884 | 0.810143 | 0.791111 | -7129.34 |
| 12 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | - | 0.571309 | 0.524371 | 0.866714 | 0.786111 | -7129.34 |
| 14 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [dropout:0.5, dropout:0.5] | 0.905916 | 0.450181 | 0.806286 | 0.790556 | -7129.34 |
| 15 | 2 | [64, 32] | [tanh, relu] | 150 | 64 | adam | [0.01, -] | xavier | [dropout-0.2, dropout-0.1] | 0.812251 | 0.474700 | 0.831714 | 0.803889 | -7129.34 |
| 16 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [SMOTE] | 0.181068 | 0.819016 | 0.932826 | 0.744554 | -7129.34 |
#model performance for training data
model_performance_classification(model16, X_train_over, y_train_over)
350/350 ━━━━━━━━━━━━━━━━━━━━ 1s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.939267 | 0.939267 | 0.939562 | 0.939256 |
#model performance for validation
model_performance_classification(model16, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.808333 | 0.808333 | 0.811959 | 0.810045 |
Observations
- With SMOTE oversampling Accuracy and Recall for Training data improved but not for validation.It shows overfiting.
- Loss has reduced for training but increased for validation.
Neural Network with Balanced Data (by applying SMOTE) and SGD Optimizer¶
#clear keras session
tf.keras.backend.clear_session()
#define model with dropout
model17 = Sequential()
model17.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model17.add(Dropout(0.2))
model17.add(Dense(32,activation="tanh"))
model17.add(Dropout(0.1))
model17.add(Dense(1,activation="sigmoid"))
#model summary
model17.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dropout (Dropout) │ (None, 64) │ 0 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dropout_1 (Dropout) │ (None, 32) │ 0 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#compile model
optimizer = tf.keras.optimizers.SGD() # defining SGD as the optimizer to be used
model17.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#start training model
start = time.time()
history17 = model17.fit(X_train_over, y_train_over, validation_data=(X_val_over,y_val_over),epochs=150,class_weight=class_weight_dict,verbose=2,batch_size = 64)
Epoch 1/150 175/175 - 2s - 9ms/step - accuracy: 0.5479 - loss: 0.6855 - val_accuracy: 0.7041 - val_loss: 0.6272 Epoch 2/150 175/175 - 1s - 5ms/step - accuracy: 0.6525 - loss: 0.6332 - val_accuracy: 0.7242 - val_loss: 0.5873 Epoch 3/150 175/175 - 1s - 4ms/step - accuracy: 0.6838 - loss: 0.6079 - val_accuracy: 0.7424 - val_loss: 0.5640 Epoch 4/150 175/175 - 1s - 4ms/step - accuracy: 0.6997 - loss: 0.5926 - val_accuracy: 0.7414 - val_loss: 0.5503 Epoch 5/150 175/175 - 1s - 3ms/step - accuracy: 0.7096 - loss: 0.5807 - val_accuracy: 0.7474 - val_loss: 0.5393 Epoch 6/150 175/175 - 0s - 3ms/step - accuracy: 0.7089 - loss: 0.5749 - val_accuracy: 0.7512 - val_loss: 0.5323 Epoch 7/150 175/175 - 1s - 3ms/step - accuracy: 0.7132 - loss: 0.5669 - val_accuracy: 0.7477 - val_loss: 0.5276 Epoch 8/150 175/175 - 1s - 3ms/step - accuracy: 0.7162 - loss: 0.5611 - val_accuracy: 0.7540 - val_loss: 0.5223 Epoch 9/150 175/175 - 1s - 4ms/step - accuracy: 0.7162 - loss: 0.5590 - val_accuracy: 0.7533 - val_loss: 0.5191 Epoch 10/150 175/175 - 1s - 3ms/step - accuracy: 0.7221 - loss: 0.5541 - val_accuracy: 0.7502 - val_loss: 0.5164 Epoch 11/150 175/175 - 1s - 4ms/step - accuracy: 0.7262 - loss: 0.5498 - val_accuracy: 0.7547 - val_loss: 0.5136 Epoch 12/150 175/175 - 1s - 7ms/step - accuracy: 0.7259 - loss: 0.5492 - val_accuracy: 0.7558 - val_loss: 0.5117 Epoch 13/150 175/175 - 1s - 7ms/step - accuracy: 0.7208 - loss: 0.5528 - val_accuracy: 0.7502 - val_loss: 0.5110 Epoch 14/150 175/175 - 1s - 6ms/step - accuracy: 0.7205 - loss: 0.5471 - val_accuracy: 0.7509 - val_loss: 0.5090 Epoch 15/150 175/175 - 1s - 3ms/step - accuracy: 0.7275 - loss: 0.5420 - val_accuracy: 0.7572 - val_loss: 0.5072 Epoch 16/150 175/175 - 0s - 3ms/step - accuracy: 0.7239 - loss: 0.5440 - val_accuracy: 0.7554 - val_loss: 0.5062 Epoch 17/150 175/175 - 1s - 3ms/step - accuracy: 0.7292 - loss: 0.5388 - val_accuracy: 0.7544 - val_loss: 0.5050 Epoch 18/150 175/175 - 1s - 3ms/step - accuracy: 0.7317 - loss: 0.5373 - val_accuracy: 0.7558 - val_loss: 0.5035 Epoch 19/150 175/175 - 1s - 4ms/step - accuracy: 0.7316 - loss: 0.5367 - val_accuracy: 0.7526 - val_loss: 0.5035 Epoch 20/150 175/175 - 1s - 3ms/step - accuracy: 0.7325 - loss: 0.5369 - val_accuracy: 0.7533 - val_loss: 0.5026 Epoch 21/150 175/175 - 1s - 4ms/step - accuracy: 0.7290 - loss: 0.5358 - val_accuracy: 0.7537 - val_loss: 0.5020 Epoch 22/150 175/175 - 1s - 3ms/step - accuracy: 0.7318 - loss: 0.5327 - val_accuracy: 0.7547 - val_loss: 0.5005 Epoch 23/150 175/175 - 1s - 4ms/step - accuracy: 0.7310 - loss: 0.5319 - val_accuracy: 0.7530 - val_loss: 0.4998 Epoch 24/150 175/175 - 1s - 3ms/step - accuracy: 0.7310 - loss: 0.5337 - val_accuracy: 0.7551 - val_loss: 0.4991 Epoch 25/150 175/175 - 1s - 4ms/step - accuracy: 0.7349 - loss: 0.5275 - val_accuracy: 0.7540 - val_loss: 0.4985 Epoch 26/150 175/175 - 0s - 3ms/step - accuracy: 0.7341 - loss: 0.5288 - val_accuracy: 0.7554 - val_loss: 0.4973 Epoch 27/150 175/175 - 1s - 3ms/step - accuracy: 0.7357 - loss: 0.5268 - val_accuracy: 0.7551 - val_loss: 0.4964 Epoch 28/150 175/175 - 1s - 3ms/step - accuracy: 0.7328 - loss: 0.5305 - val_accuracy: 0.7565 - val_loss: 0.4962 Epoch 29/150 175/175 - 1s - 3ms/step - accuracy: 0.7411 - loss: 0.5254 - val_accuracy: 0.7572 - val_loss: 0.4954 Epoch 30/150 175/175 - 1s - 5ms/step - accuracy: 0.7396 - loss: 0.5247 - val_accuracy: 0.7569 - val_loss: 0.4951 Epoch 31/150 175/175 - 1s - 6ms/step - accuracy: 0.7426 - loss: 0.5234 - val_accuracy: 0.7604 - val_loss: 0.4939 Epoch 32/150 175/175 - 1s - 7ms/step - accuracy: 0.7408 - loss: 0.5232 - val_accuracy: 0.7600 - val_loss: 0.4932 Epoch 33/150 175/175 - 1s - 6ms/step - accuracy: 0.7414 - loss: 0.5197 - val_accuracy: 0.7621 - val_loss: 0.4925 Epoch 34/150 175/175 - 1s - 4ms/step - accuracy: 0.7411 - loss: 0.5214 - val_accuracy: 0.7625 - val_loss: 0.4916 Epoch 35/150 175/175 - 1s - 3ms/step - accuracy: 0.7441 - loss: 0.5211 - val_accuracy: 0.7649 - val_loss: 0.4907 Epoch 36/150 175/175 - 1s - 4ms/step - accuracy: 0.7421 - loss: 0.5189 - val_accuracy: 0.7653 - val_loss: 0.4899 Epoch 37/150 175/175 - 1s - 3ms/step - accuracy: 0.7429 - loss: 0.5204 - val_accuracy: 0.7674 - val_loss: 0.4891 Epoch 38/150 175/175 - 1s - 3ms/step - accuracy: 0.7465 - loss: 0.5183 - val_accuracy: 0.7646 - val_loss: 0.4883 Epoch 39/150 175/175 - 1s - 4ms/step - accuracy: 0.7471 - loss: 0.5142 - val_accuracy: 0.7677 - val_loss: 0.4869 Epoch 40/150 175/175 - 0s - 3ms/step - accuracy: 0.7453 - loss: 0.5149 - val_accuracy: 0.7688 - val_loss: 0.4861 Epoch 41/150 175/175 - 1s - 3ms/step - accuracy: 0.7476 - loss: 0.5160 - val_accuracy: 0.7625 - val_loss: 0.4863 Epoch 42/150 175/175 - 1s - 3ms/step - accuracy: 0.7494 - loss: 0.5142 - val_accuracy: 0.7684 - val_loss: 0.4843 Epoch 43/150 175/175 - 0s - 3ms/step - accuracy: 0.7535 - loss: 0.5102 - val_accuracy: 0.7699 - val_loss: 0.4835 Epoch 44/150 175/175 - 1s - 3ms/step - accuracy: 0.7478 - loss: 0.5121 - val_accuracy: 0.7699 - val_loss: 0.4825 Epoch 45/150 175/175 - 1s - 3ms/step - accuracy: 0.7489 - loss: 0.5124 - val_accuracy: 0.7702 - val_loss: 0.4814 Epoch 46/150 175/175 - 1s - 3ms/step - accuracy: 0.7499 - loss: 0.5100 - val_accuracy: 0.7688 - val_loss: 0.4805 Epoch 47/150 175/175 - 1s - 4ms/step - accuracy: 0.7497 - loss: 0.5105 - val_accuracy: 0.7706 - val_loss: 0.4796 Epoch 48/150 175/175 - 1s - 3ms/step - accuracy: 0.7534 - loss: 0.5089 - val_accuracy: 0.7713 - val_loss: 0.4787 Epoch 49/150 175/175 - 1s - 3ms/step - accuracy: 0.7537 - loss: 0.5052 - val_accuracy: 0.7741 - val_loss: 0.4772 Epoch 50/150 175/175 - 1s - 4ms/step - accuracy: 0.7538 - loss: 0.5070 - val_accuracy: 0.7744 - val_loss: 0.4764 Epoch 51/150 175/175 - 1s - 7ms/step - accuracy: 0.7565 - loss: 0.5043 - val_accuracy: 0.7723 - val_loss: 0.4757 Epoch 52/150 175/175 - 1s - 7ms/step - accuracy: 0.7554 - loss: 0.5074 - val_accuracy: 0.7723 - val_loss: 0.4741 Epoch 53/150 175/175 - 1s - 6ms/step - accuracy: 0.7549 - loss: 0.5032 - val_accuracy: 0.7727 - val_loss: 0.4738 Epoch 54/150 175/175 - 1s - 4ms/step - accuracy: 0.7562 - loss: 0.5023 - val_accuracy: 0.7741 - val_loss: 0.4729 Epoch 55/150 175/175 - 0s - 3ms/step - accuracy: 0.7570 - loss: 0.5013 - val_accuracy: 0.7758 - val_loss: 0.4721 Epoch 56/150 175/175 - 1s - 4ms/step - accuracy: 0.7625 - loss: 0.4969 - val_accuracy: 0.7755 - val_loss: 0.4709 Epoch 57/150 175/175 - 1s - 3ms/step - accuracy: 0.7610 - loss: 0.4963 - val_accuracy: 0.7748 - val_loss: 0.4699 Epoch 58/150 175/175 - 1s - 4ms/step - accuracy: 0.7626 - loss: 0.4963 - val_accuracy: 0.7800 - val_loss: 0.4684 Epoch 59/150 175/175 - 1s - 3ms/step - accuracy: 0.7578 - loss: 0.4976 - val_accuracy: 0.7779 - val_loss: 0.4675 Epoch 60/150 175/175 - 1s - 4ms/step - accuracy: 0.7617 - loss: 0.4933 - val_accuracy: 0.7783 - val_loss: 0.4668 Epoch 61/150 175/175 - 1s - 4ms/step - accuracy: 0.7602 - loss: 0.4960 - val_accuracy: 0.7786 - val_loss: 0.4659 Epoch 62/150 175/175 - 1s - 3ms/step - accuracy: 0.7648 - loss: 0.4905 - val_accuracy: 0.7772 - val_loss: 0.4651 Epoch 63/150 175/175 - 0s - 3ms/step - accuracy: 0.7627 - loss: 0.4925 - val_accuracy: 0.7772 - val_loss: 0.4647 Epoch 64/150 175/175 - 1s - 3ms/step - accuracy: 0.7646 - loss: 0.4922 - val_accuracy: 0.7786 - val_loss: 0.4634 Epoch 65/150 175/175 - 1s - 3ms/step - accuracy: 0.7606 - loss: 0.4918 - val_accuracy: 0.7783 - val_loss: 0.4623 Epoch 66/150 175/175 - 1s - 3ms/step - accuracy: 0.7666 - loss: 0.4868 - val_accuracy: 0.7772 - val_loss: 0.4618 Epoch 67/150 175/175 - 0s - 3ms/step - accuracy: 0.7663 - loss: 0.4887 - val_accuracy: 0.7776 - val_loss: 0.4614 Epoch 68/150 175/175 - 1s - 4ms/step - accuracy: 0.7650 - loss: 0.4867 - val_accuracy: 0.7790 - val_loss: 0.4597 Epoch 69/150 175/175 - 1s - 5ms/step - accuracy: 0.7721 - loss: 0.4834 - val_accuracy: 0.7790 - val_loss: 0.4587 Epoch 70/150 175/175 - 1s - 7ms/step - accuracy: 0.7687 - loss: 0.4845 - val_accuracy: 0.7804 - val_loss: 0.4581 Epoch 71/150 175/175 - 1s - 7ms/step - accuracy: 0.7693 - loss: 0.4820 - val_accuracy: 0.7807 - val_loss: 0.4568 Epoch 72/150 175/175 - 1s - 3ms/step - accuracy: 0.7716 - loss: 0.4806 - val_accuracy: 0.7800 - val_loss: 0.4560 Epoch 73/150 175/175 - 1s - 3ms/step - accuracy: 0.7664 - loss: 0.4856 - val_accuracy: 0.7818 - val_loss: 0.4561 Epoch 74/150 175/175 - 1s - 3ms/step - accuracy: 0.7672 - loss: 0.4825 - val_accuracy: 0.7818 - val_loss: 0.4547 Epoch 75/150 175/175 - 1s - 4ms/step - accuracy: 0.7684 - loss: 0.4833 - val_accuracy: 0.7825 - val_loss: 0.4541 Epoch 76/150 175/175 - 1s - 3ms/step - accuracy: 0.7680 - loss: 0.4840 - val_accuracy: 0.7822 - val_loss: 0.4534 Epoch 77/150 175/175 - 1s - 3ms/step - accuracy: 0.7698 - loss: 0.4798 - val_accuracy: 0.7836 - val_loss: 0.4525 Epoch 78/150 175/175 - 1s - 4ms/step - accuracy: 0.7691 - loss: 0.4792 - val_accuracy: 0.7836 - val_loss: 0.4518 Epoch 79/150 175/175 - 1s - 3ms/step - accuracy: 0.7681 - loss: 0.4790 - val_accuracy: 0.7853 - val_loss: 0.4506 Epoch 80/150 175/175 - 1s - 3ms/step - accuracy: 0.7724 - loss: 0.4774 - val_accuracy: 0.7846 - val_loss: 0.4505 Epoch 81/150 175/175 - 1s - 4ms/step - accuracy: 0.7743 - loss: 0.4761 - val_accuracy: 0.7832 - val_loss: 0.4498 Epoch 82/150 175/175 - 1s - 3ms/step - accuracy: 0.7744 - loss: 0.4726 - val_accuracy: 0.7843 - val_loss: 0.4492 Epoch 83/150 175/175 - 0s - 3ms/step - accuracy: 0.7748 - loss: 0.4747 - val_accuracy: 0.7839 - val_loss: 0.4484 Epoch 84/150 175/175 - 1s - 4ms/step - accuracy: 0.7758 - loss: 0.4743 - val_accuracy: 0.7836 - val_loss: 0.4479 Epoch 85/150 175/175 - 1s - 3ms/step - accuracy: 0.7796 - loss: 0.4696 - val_accuracy: 0.7850 - val_loss: 0.4471 Epoch 86/150 175/175 - 1s - 4ms/step - accuracy: 0.7750 - loss: 0.4748 - val_accuracy: 0.7839 - val_loss: 0.4467 Epoch 87/150 175/175 - 1s - 4ms/step - accuracy: 0.7784 - loss: 0.4717 - val_accuracy: 0.7860 - val_loss: 0.4462 Epoch 88/150 175/175 - 1s - 5ms/step - accuracy: 0.7761 - loss: 0.4705 - val_accuracy: 0.7850 - val_loss: 0.4458 Epoch 89/150 175/175 - 1s - 7ms/step - accuracy: 0.7816 - loss: 0.4657 - val_accuracy: 0.7829 - val_loss: 0.4454 Epoch 90/150 175/175 - 1s - 7ms/step - accuracy: 0.7839 - loss: 0.4681 - val_accuracy: 0.7916 - val_loss: 0.4439 Epoch 91/150 175/175 - 1s - 6ms/step - accuracy: 0.7787 - loss: 0.4691 - val_accuracy: 0.7867 - val_loss: 0.4433 Epoch 92/150 175/175 - 1s - 4ms/step - accuracy: 0.7806 - loss: 0.4662 - val_accuracy: 0.7853 - val_loss: 0.4424 Epoch 93/150 175/175 - 1s - 3ms/step - accuracy: 0.7798 - loss: 0.4654 - val_accuracy: 0.7895 - val_loss: 0.4420 Epoch 94/150 175/175 - 0s - 3ms/step - accuracy: 0.7815 - loss: 0.4637 - val_accuracy: 0.7892 - val_loss: 0.4415 Epoch 95/150 175/175 - 1s - 3ms/step - accuracy: 0.7780 - loss: 0.4682 - val_accuracy: 0.7881 - val_loss: 0.4415 Epoch 96/150 175/175 - 1s - 4ms/step - accuracy: 0.7800 - loss: 0.4663 - val_accuracy: 0.7867 - val_loss: 0.4408 Epoch 97/150 175/175 - 1s - 3ms/step - accuracy: 0.7840 - loss: 0.4608 - val_accuracy: 0.7874 - val_loss: 0.4405 Epoch 98/150 175/175 - 1s - 3ms/step - accuracy: 0.7811 - loss: 0.4637 - val_accuracy: 0.7867 - val_loss: 0.4405 Epoch 99/150 175/175 - 0s - 3ms/step - accuracy: 0.7824 - loss: 0.4618 - val_accuracy: 0.7867 - val_loss: 0.4401 Epoch 100/150 175/175 - 1s - 3ms/step - accuracy: 0.7821 - loss: 0.4605 - val_accuracy: 0.7874 - val_loss: 0.4396 Epoch 101/150 175/175 - 1s - 3ms/step - accuracy: 0.7817 - loss: 0.4609 - val_accuracy: 0.7885 - val_loss: 0.4392 Epoch 102/150 175/175 - 1s - 4ms/step - accuracy: 0.7817 - loss: 0.4612 - val_accuracy: 0.7885 - val_loss: 0.4386 Epoch 103/150 175/175 - 1s - 3ms/step - accuracy: 0.7821 - loss: 0.4602 - val_accuracy: 0.7874 - val_loss: 0.4385 Epoch 104/150 175/175 - 1s - 4ms/step - accuracy: 0.7845 - loss: 0.4551 - val_accuracy: 0.7885 - val_loss: 0.4378 Epoch 105/150 175/175 - 1s - 4ms/step - accuracy: 0.7814 - loss: 0.4600 - val_accuracy: 0.7892 - val_loss: 0.4379 Epoch 106/150 175/175 - 1s - 3ms/step - accuracy: 0.7844 - loss: 0.4585 - val_accuracy: 0.7878 - val_loss: 0.4375 Epoch 107/150 175/175 - 1s - 4ms/step - accuracy: 0.7832 - loss: 0.4591 - val_accuracy: 0.7885 - val_loss: 0.4370 Epoch 108/150 175/175 - 1s - 7ms/step - accuracy: 0.7860 - loss: 0.4561 - val_accuracy: 0.7902 - val_loss: 0.4364 Epoch 109/150 175/175 - 1s - 7ms/step - accuracy: 0.7847 - loss: 0.4573 - val_accuracy: 0.7892 - val_loss: 0.4365 Epoch 110/150 175/175 - 1s - 3ms/step - accuracy: 0.7842 - loss: 0.4569 - val_accuracy: 0.7892 - val_loss: 0.4358 Epoch 111/150 175/175 - 1s - 4ms/step - accuracy: 0.7881 - loss: 0.4559 - val_accuracy: 0.7867 - val_loss: 0.4357 Epoch 112/150 175/175 - 1s - 3ms/step - accuracy: 0.7859 - loss: 0.4570 - val_accuracy: 0.7871 - val_loss: 0.4353 Epoch 113/150 175/175 - 1s - 3ms/step - accuracy: 0.7848 - loss: 0.4584 - val_accuracy: 0.7899 - val_loss: 0.4350 Epoch 114/150 175/175 - 1s - 3ms/step - accuracy: 0.7875 - loss: 0.4525 - val_accuracy: 0.7895 - val_loss: 0.4345 Epoch 115/150 175/175 - 0s - 3ms/step - accuracy: 0.7868 - loss: 0.4550 - val_accuracy: 0.7920 - val_loss: 0.4346 Epoch 116/150 175/175 - 1s - 4ms/step - accuracy: 0.7875 - loss: 0.4535 - val_accuracy: 0.7902 - val_loss: 0.4340 Epoch 117/150 175/175 - 0s - 3ms/step - accuracy: 0.7862 - loss: 0.4539 - val_accuracy: 0.7892 - val_loss: 0.4332 Epoch 118/150 175/175 - 1s - 4ms/step - accuracy: 0.7893 - loss: 0.4506 - val_accuracy: 0.7895 - val_loss: 0.4331 Epoch 119/150 175/175 - 1s - 3ms/step - accuracy: 0.7865 - loss: 0.4508 - val_accuracy: 0.7920 - val_loss: 0.4329 Epoch 120/150 175/175 - 1s - 3ms/step - accuracy: 0.7916 - loss: 0.4513 - val_accuracy: 0.7930 - val_loss: 0.4326 Epoch 121/150 175/175 - 1s - 3ms/step - accuracy: 0.7877 - loss: 0.4485 - val_accuracy: 0.7920 - val_loss: 0.4323 Epoch 122/150 175/175 - 0s - 3ms/step - accuracy: 0.7892 - loss: 0.4503 - val_accuracy: 0.7927 - val_loss: 0.4323 Epoch 123/150 175/175 - 1s - 4ms/step - accuracy: 0.7896 - loss: 0.4487 - val_accuracy: 0.7895 - val_loss: 0.4324 Epoch 124/150 175/175 - 1s - 3ms/step - accuracy: 0.7897 - loss: 0.4516 - val_accuracy: 0.7937 - val_loss: 0.4311 Epoch 125/150 175/175 - 0s - 3ms/step - accuracy: 0.7947 - loss: 0.4468 - val_accuracy: 0.7923 - val_loss: 0.4300 Epoch 126/150 175/175 - 1s - 4ms/step - accuracy: 0.7911 - loss: 0.4465 - val_accuracy: 0.7944 - val_loss: 0.4304 Epoch 127/150 175/175 - 1s - 4ms/step - accuracy: 0.7911 - loss: 0.4513 - val_accuracy: 0.7895 - val_loss: 0.4297 Epoch 128/150 175/175 - 1s - 7ms/step - accuracy: 0.7894 - loss: 0.4491 - val_accuracy: 0.7885 - val_loss: 0.4294 Epoch 129/150 175/175 - 1s - 7ms/step - accuracy: 0.7952 - loss: 0.4447 - val_accuracy: 0.7952 - val_loss: 0.4285 Epoch 130/150 175/175 - 1s - 6ms/step - accuracy: 0.7909 - loss: 0.4468 - val_accuracy: 0.7941 - val_loss: 0.4286 Epoch 131/150 175/175 - 1s - 4ms/step - accuracy: 0.7919 - loss: 0.4487 - val_accuracy: 0.7923 - val_loss: 0.4279 Epoch 132/150 175/175 - 1s - 3ms/step - accuracy: 0.7876 - loss: 0.4478 - val_accuracy: 0.7927 - val_loss: 0.4283 Epoch 133/150 175/175 - 1s - 4ms/step - accuracy: 0.7891 - loss: 0.4467 - val_accuracy: 0.7934 - val_loss: 0.4276 Epoch 134/150 175/175 - 0s - 3ms/step - accuracy: 0.7963 - loss: 0.4417 - val_accuracy: 0.7923 - val_loss: 0.4272 Epoch 135/150 175/175 - 1s - 3ms/step - accuracy: 0.7924 - loss: 0.4452 - val_accuracy: 0.7983 - val_loss: 0.4271 Epoch 136/150 175/175 - 1s - 3ms/step - accuracy: 0.7931 - loss: 0.4456 - val_accuracy: 0.7934 - val_loss: 0.4266 Epoch 137/150 175/175 - 1s - 3ms/step - accuracy: 0.7942 - loss: 0.4462 - val_accuracy: 0.7934 - val_loss: 0.4263 Epoch 138/150 175/175 - 1s - 3ms/step - accuracy: 0.7974 - loss: 0.4405 - val_accuracy: 0.7948 - val_loss: 0.4260 Epoch 139/150 175/175 - 1s - 4ms/step - accuracy: 0.7936 - loss: 0.4449 - val_accuracy: 0.7976 - val_loss: 0.4257 Epoch 140/150 175/175 - 1s - 4ms/step - accuracy: 0.7916 - loss: 0.4434 - val_accuracy: 0.7994 - val_loss: 0.4253 Epoch 141/150 175/175 - 1s - 3ms/step - accuracy: 0.7937 - loss: 0.4427 - val_accuracy: 0.7983 - val_loss: 0.4251 Epoch 142/150 175/175 - 1s - 3ms/step - accuracy: 0.7973 - loss: 0.4408 - val_accuracy: 0.7955 - val_loss: 0.4239 Epoch 143/150 175/175 - 0s - 3ms/step - accuracy: 0.7930 - loss: 0.4404 - val_accuracy: 0.7923 - val_loss: 0.4239 Epoch 144/150 175/175 - 1s - 3ms/step - accuracy: 0.7914 - loss: 0.4447 - val_accuracy: 0.7983 - val_loss: 0.4233 Epoch 145/150 175/175 - 1s - 4ms/step - accuracy: 0.7956 - loss: 0.4429 - val_accuracy: 0.7983 - val_loss: 0.4227 Epoch 146/150 175/175 - 1s - 5ms/step - accuracy: 0.7927 - loss: 0.4385 - val_accuracy: 0.7959 - val_loss: 0.4223 Epoch 147/150 175/175 - 1s - 6ms/step - accuracy: 0.7977 - loss: 0.4401 - val_accuracy: 0.7944 - val_loss: 0.4222 Epoch 148/150 175/175 - 1s - 4ms/step - accuracy: 0.7970 - loss: 0.4411 - val_accuracy: 0.7973 - val_loss: 0.4223 Epoch 149/150 175/175 - 1s - 6ms/step - accuracy: 0.7967 - loss: 0.4393 - val_accuracy: 0.7944 - val_loss: 0.4213 Epoch 150/150 175/175 - 1s - 4ms/step - accuracy: 0.7930 - loss: 0.4411 - val_accuracy: 0.7966 - val_loss: 0.4206
#Observing Accuracy plot
plot(history17,'accuracy')
# Observing Loss curve
plot(history17,'loss')
#model comparision
results.loc[17] = [2,[64,32],['relu','tanh'],150,32,"SGD",['-', "-"],'xavier',["SMOTE"],history17.history["loss"][-1],history17.history["val_loss"][-1],history17.history["accuracy"][-1],history17.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
| 10 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2] | 0.892885 | 0.456302 | 0.804857 | 0.776111 | -7129.34 |
| 11 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2, BatchNormalization] | 0.808190 | 0.446884 | 0.810143 | 0.791111 | -7129.34 |
| 12 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | - | 0.571309 | 0.524371 | 0.866714 | 0.786111 | -7129.34 |
| 14 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [dropout:0.5, dropout:0.5] | 0.905916 | 0.450181 | 0.806286 | 0.790556 | -7129.34 |
| 15 | 2 | [64, 32] | [tanh, relu] | 150 | 64 | adam | [0.01, -] | xavier | [dropout-0.2, dropout-0.1] | 0.812251 | 0.474700 | 0.831714 | 0.803889 | -7129.34 |
| 16 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [SMOTE] | 0.181068 | 0.819016 | 0.932826 | 0.744554 | -7129.34 |
| 17 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | SGD | [-, -] | xavier | [SMOTE] | 0.441068 | 0.420571 | 0.793023 | 0.796557 | -7129.34 |
#observing model performance for training data
model_performance_classification(model17, X_train_over, y_train_over)
350/350 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.811181 | 0.811181 | 0.811379 | 0.811151 |
#Observing model performance for validation data
model_performance_classification(model17, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.788333 | 0.788333 | 0.826113 | 0.800716 |
Observations
- With this model,it improved the training loss.
- Gets better validation accuracy than Adam with SMOTE.
Neural Network with Balanced Data (by applying SMOTE), Adam Optimizer, and Dropout¶
#clear keras session
tf.keras.backend.clear_session()
#define model with dropout
model18 = Sequential()
model18.add(Dense(64,activation="relu",input_dim=X_train.shape[1]))
model18.add(Dropout(0.2))
model18.add(Dense(32,activation="tanh"))
model18.add(Dropout(0.1))
model18.add(Dense(1,activation="sigmoid"))
#model summary
model18.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dropout (Dropout) │ (None, 64) │ 0 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dropout_1 (Dropout) │ (None, 32) │ 0 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#compile model
optimizer = tf.keras.optimizers.Adam(0.01) # defining Adam as the optimizer to be used
model18.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#start training model
start = time.time()
history18 = model18.fit(X_train_over, y_train_over, validation_data=(X_val_over,y_val_over),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 32)
Epoch 1/150 350/350 - 2s - 7ms/step - accuracy: 0.6313 - loss: 1.2642 - val_accuracy: 0.6535 - val_loss: 0.6796 Epoch 2/150 350/350 - 1s - 2ms/step - accuracy: 0.6868 - loss: 1.1260 - val_accuracy: 0.6662 - val_loss: 0.7324 Epoch 3/150 350/350 - 1s - 2ms/step - accuracy: 0.7047 - loss: 1.0976 - val_accuracy: 0.7073 - val_loss: 0.5971 Epoch 4/150 350/350 - 1s - 4ms/step - accuracy: 0.7163 - loss: 1.0662 - val_accuracy: 0.7193 - val_loss: 0.5719 Epoch 5/150 350/350 - 1s - 4ms/step - accuracy: 0.7235 - loss: 1.0581 - val_accuracy: 0.7277 - val_loss: 0.5990 Epoch 6/150 350/350 - 1s - 4ms/step - accuracy: 0.7290 - loss: 1.0409 - val_accuracy: 0.7045 - val_loss: 0.6047 Epoch 7/150 350/350 - 1s - 3ms/step - accuracy: 0.7429 - loss: 1.0113 - val_accuracy: 0.7249 - val_loss: 0.6059 Epoch 8/150 350/350 - 1s - 3ms/step - accuracy: 0.7405 - loss: 1.0129 - val_accuracy: 0.7431 - val_loss: 0.5343 Epoch 9/150 350/350 - 2s - 4ms/step - accuracy: 0.7457 - loss: 0.9910 - val_accuracy: 0.7417 - val_loss: 0.5710 Epoch 10/150 350/350 - 2s - 6ms/step - accuracy: 0.7486 - loss: 0.9858 - val_accuracy: 0.7470 - val_loss: 0.5289 Epoch 11/150 350/350 - 1s - 2ms/step - accuracy: 0.7559 - loss: 0.9760 - val_accuracy: 0.7330 - val_loss: 0.6088 Epoch 12/150 350/350 - 1s - 4ms/step - accuracy: 0.7509 - loss: 0.9742 - val_accuracy: 0.7653 - val_loss: 0.5081 Epoch 13/150 350/350 - 1s - 3ms/step - accuracy: 0.7561 - loss: 0.9649 - val_accuracy: 0.7319 - val_loss: 0.6125 Epoch 14/150 350/350 - 1s - 4ms/step - accuracy: 0.7584 - loss: 0.9527 - val_accuracy: 0.7565 - val_loss: 0.5361 Epoch 15/150 350/350 - 1s - 4ms/step - accuracy: 0.7664 - loss: 0.9367 - val_accuracy: 0.7431 - val_loss: 0.5323 Epoch 16/150 350/350 - 1s - 4ms/step - accuracy: 0.7650 - loss: 0.9355 - val_accuracy: 0.7393 - val_loss: 0.5921 Epoch 17/150 350/350 - 1s - 2ms/step - accuracy: 0.7677 - loss: 0.9324 - val_accuracy: 0.7533 - val_loss: 0.5269 Epoch 18/150 350/350 - 2s - 4ms/step - accuracy: 0.7706 - loss: 0.9167 - val_accuracy: 0.7544 - val_loss: 0.5840 Epoch 19/150 350/350 - 1s - 4ms/step - accuracy: 0.7716 - loss: 0.9141 - val_accuracy: 0.7583 - val_loss: 0.5313 Epoch 20/150 350/350 - 2s - 6ms/step - accuracy: 0.7701 - loss: 0.9195 - val_accuracy: 0.7579 - val_loss: 0.5556 Epoch 21/150 350/350 - 1s - 3ms/step - accuracy: 0.7709 - loss: 0.9078 - val_accuracy: 0.7509 - val_loss: 0.5206 Epoch 22/150 350/350 - 1s - 4ms/step - accuracy: 0.7725 - loss: 0.9076 - val_accuracy: 0.7519 - val_loss: 0.5401 Epoch 23/150 350/350 - 1s - 4ms/step - accuracy: 0.7751 - loss: 0.9081 - val_accuracy: 0.7477 - val_loss: 0.5482 Epoch 24/150 350/350 - 1s - 4ms/step - accuracy: 0.7716 - loss: 0.9127 - val_accuracy: 0.7495 - val_loss: 0.5512 Epoch 25/150 350/350 - 1s - 4ms/step - accuracy: 0.7767 - loss: 0.9050 - val_accuracy: 0.7597 - val_loss: 0.5300 Epoch 26/150 350/350 - 1s - 4ms/step - accuracy: 0.7779 - loss: 0.8953 - val_accuracy: 0.7533 - val_loss: 0.5621 Epoch 27/150 350/350 - 1s - 2ms/step - accuracy: 0.7835 - loss: 0.8891 - val_accuracy: 0.7621 - val_loss: 0.5301 Epoch 28/150 350/350 - 2s - 4ms/step - accuracy: 0.7771 - loss: 0.8935 - val_accuracy: 0.7667 - val_loss: 0.5094 Epoch 29/150 350/350 - 1s - 4ms/step - accuracy: 0.7779 - loss: 0.8929 - val_accuracy: 0.7558 - val_loss: 0.5274 Epoch 30/150 350/350 - 2s - 6ms/step - accuracy: 0.7819 - loss: 0.8901 - val_accuracy: 0.7554 - val_loss: 0.5385 Epoch 31/150 350/350 - 1s - 4ms/step - accuracy: 0.7758 - loss: 0.8914 - val_accuracy: 0.7660 - val_loss: 0.5056 Epoch 32/150 350/350 - 1s - 3ms/step - accuracy: 0.7809 - loss: 0.8868 - val_accuracy: 0.7460 - val_loss: 0.5633 Epoch 33/150 350/350 - 1s - 3ms/step - accuracy: 0.7831 - loss: 0.8835 - val_accuracy: 0.7554 - val_loss: 0.5541 Epoch 34/150 350/350 - 1s - 3ms/step - accuracy: 0.7826 - loss: 0.8802 - val_accuracy: 0.7561 - val_loss: 0.5297 Epoch 35/150 350/350 - 1s - 3ms/step - accuracy: 0.7835 - loss: 0.8736 - val_accuracy: 0.7586 - val_loss: 0.5296 Epoch 36/150 350/350 - 1s - 3ms/step - accuracy: 0.7781 - loss: 0.8771 - val_accuracy: 0.7674 - val_loss: 0.5256 Epoch 37/150 350/350 - 1s - 3ms/step - accuracy: 0.7842 - loss: 0.8780 - val_accuracy: 0.7554 - val_loss: 0.5355 Epoch 38/150 350/350 - 1s - 4ms/step - accuracy: 0.7801 - loss: 0.8709 - val_accuracy: 0.7597 - val_loss: 0.5722 Epoch 39/150 350/350 - 2s - 5ms/step - accuracy: 0.7834 - loss: 0.8722 - val_accuracy: 0.7481 - val_loss: 0.5398 Epoch 40/150 350/350 - 1s - 4ms/step - accuracy: 0.7835 - loss: 0.8640 - val_accuracy: 0.7653 - val_loss: 0.5542 Epoch 41/150 350/350 - 2s - 6ms/step - accuracy: 0.7877 - loss: 0.8564 - val_accuracy: 0.7512 - val_loss: 0.5496 Epoch 42/150 350/350 - 1s - 4ms/step - accuracy: 0.7770 - loss: 0.8883 - val_accuracy: 0.7558 - val_loss: 0.5803 Epoch 43/150 350/350 - 1s - 2ms/step - accuracy: 0.7850 - loss: 0.8554 - val_accuracy: 0.7516 - val_loss: 0.5588 Epoch 44/150 350/350 - 1s - 4ms/step - accuracy: 0.7890 - loss: 0.8462 - val_accuracy: 0.7646 - val_loss: 0.5430 Epoch 45/150 350/350 - 1s - 4ms/step - accuracy: 0.7850 - loss: 0.8629 - val_accuracy: 0.7463 - val_loss: 0.5750 Epoch 46/150 350/350 - 1s - 4ms/step - accuracy: 0.7905 - loss: 0.8528 - val_accuracy: 0.7519 - val_loss: 0.5913 Epoch 47/150 350/350 - 1s - 2ms/step - accuracy: 0.7842 - loss: 0.8662 - val_accuracy: 0.7621 - val_loss: 0.5413 Epoch 48/150 350/350 - 2s - 4ms/step - accuracy: 0.7897 - loss: 0.8438 - val_accuracy: 0.7505 - val_loss: 0.5969 Epoch 49/150 350/350 - 1s - 4ms/step - accuracy: 0.7903 - loss: 0.8473 - val_accuracy: 0.7533 - val_loss: 0.5796 Epoch 50/150 350/350 - 2s - 6ms/step - accuracy: 0.7873 - loss: 0.8537 - val_accuracy: 0.7618 - val_loss: 0.5587 Epoch 51/150 350/350 - 1s - 3ms/step - accuracy: 0.7907 - loss: 0.8529 - val_accuracy: 0.7628 - val_loss: 0.5581 Epoch 52/150 350/350 - 1s - 3ms/step - accuracy: 0.7906 - loss: 0.8482 - val_accuracy: 0.7516 - val_loss: 0.5746 Epoch 53/150 350/350 - 1s - 2ms/step - accuracy: 0.7909 - loss: 0.8414 - val_accuracy: 0.7706 - val_loss: 0.5131 Epoch 54/150 350/350 - 1s - 2ms/step - accuracy: 0.7960 - loss: 0.8335 - val_accuracy: 0.7561 - val_loss: 0.5679 Epoch 55/150 350/350 - 1s - 4ms/step - accuracy: 0.7948 - loss: 0.8426 - val_accuracy: 0.7614 - val_loss: 0.5436 Epoch 56/150 350/350 - 1s - 2ms/step - accuracy: 0.7870 - loss: 0.8541 - val_accuracy: 0.7544 - val_loss: 0.5646 Epoch 57/150 350/350 - 1s - 2ms/step - accuracy: 0.7870 - loss: 0.8496 - val_accuracy: 0.7576 - val_loss: 0.5625 Epoch 58/150 350/350 - 1s - 4ms/step - accuracy: 0.7942 - loss: 0.8380 - val_accuracy: 0.7505 - val_loss: 0.5734 Epoch 59/150 350/350 - 2s - 5ms/step - accuracy: 0.7900 - loss: 0.8525 - val_accuracy: 0.7403 - val_loss: 0.6071 Epoch 60/150 350/350 - 1s - 4ms/step - accuracy: 0.7909 - loss: 0.8436 - val_accuracy: 0.7607 - val_loss: 0.5306 Epoch 61/150 350/350 - 2s - 6ms/step - accuracy: 0.8000 - loss: 0.8249 - val_accuracy: 0.7586 - val_loss: 0.5743 Epoch 62/150 350/350 - 1s - 4ms/step - accuracy: 0.7982 - loss: 0.8402 - val_accuracy: 0.7512 - val_loss: 0.5510 Epoch 63/150 350/350 - 1s - 4ms/step - accuracy: 0.7934 - loss: 0.8416 - val_accuracy: 0.7477 - val_loss: 0.5652 Epoch 64/150 350/350 - 1s - 4ms/step - accuracy: 0.7943 - loss: 0.8371 - val_accuracy: 0.7533 - val_loss: 0.5724 Epoch 65/150 350/350 - 1s - 4ms/step - accuracy: 0.7921 - loss: 0.8409 - val_accuracy: 0.7625 - val_loss: 0.5528 Epoch 66/150 350/350 - 1s - 3ms/step - accuracy: 0.7909 - loss: 0.8389 - val_accuracy: 0.7491 - val_loss: 0.5677 Epoch 67/150 350/350 - 1s - 3ms/step - accuracy: 0.8002 - loss: 0.8286 - val_accuracy: 0.7569 - val_loss: 0.5516 Epoch 68/150 350/350 - 2s - 4ms/step - accuracy: 0.7960 - loss: 0.8296 - val_accuracy: 0.7547 - val_loss: 0.5875 Epoch 69/150 350/350 - 4s - 13ms/step - accuracy: 0.7902 - loss: 0.8472 - val_accuracy: 0.7642 - val_loss: 0.5602 Epoch 70/150 350/350 - 3s - 10ms/step - accuracy: 0.7973 - loss: 0.8294 - val_accuracy: 0.7590 - val_loss: 0.5787 Epoch 71/150 350/350 - 2s - 6ms/step - accuracy: 0.7923 - loss: 0.8326 - val_accuracy: 0.7677 - val_loss: 0.5263 Epoch 72/150 350/350 - 3s - 7ms/step - accuracy: 0.7992 - loss: 0.8290 - val_accuracy: 0.7547 - val_loss: 0.5295 Epoch 73/150 350/350 - 2s - 6ms/step - accuracy: 0.7880 - loss: 0.8427 - val_accuracy: 0.7488 - val_loss: 0.5801 Epoch 74/150 350/350 - 1s - 4ms/step - accuracy: 0.7892 - loss: 0.8426 - val_accuracy: 0.7537 - val_loss: 0.5590 Epoch 75/150 350/350 - 2s - 6ms/step - accuracy: 0.7983 - loss: 0.8258 - val_accuracy: 0.7547 - val_loss: 0.5453 Epoch 76/150 350/350 - 1s - 4ms/step - accuracy: 0.7869 - loss: 0.8422 - val_accuracy: 0.7716 - val_loss: 0.5473 Epoch 77/150 350/350 - 3s - 8ms/step - accuracy: 0.7994 - loss: 0.8167 - val_accuracy: 0.7491 - val_loss: 0.5607 Epoch 78/150 350/350 - 3s - 8ms/step - accuracy: 0.7956 - loss: 0.8261 - val_accuracy: 0.7593 - val_loss: 0.5398 Epoch 79/150 350/350 - 3s - 9ms/step - accuracy: 0.7939 - loss: 0.8279 - val_accuracy: 0.7505 - val_loss: 0.5709 Epoch 80/150 350/350 - 4s - 10ms/step - accuracy: 0.7979 - loss: 0.8259 - val_accuracy: 0.7512 - val_loss: 0.5678 Epoch 81/150 350/350 - 3s - 9ms/step - accuracy: 0.7948 - loss: 0.8327 - val_accuracy: 0.7460 - val_loss: 0.5706 Epoch 82/150 350/350 - 2s - 7ms/step - accuracy: 0.7933 - loss: 0.8388 - val_accuracy: 0.7569 - val_loss: 0.5568 Epoch 83/150 350/350 - 2s - 7ms/step - accuracy: 0.7956 - loss: 0.8269 - val_accuracy: 0.7533 - val_loss: 0.5531 Epoch 84/150 350/350 - 2s - 7ms/step - accuracy: 0.7949 - loss: 0.8397 - val_accuracy: 0.7649 - val_loss: 0.5363 Epoch 85/150 350/350 - 1s - 4ms/step - accuracy: 0.7978 - loss: 0.8244 - val_accuracy: 0.7586 - val_loss: 0.5674 Epoch 86/150 350/350 - 2s - 6ms/step - accuracy: 0.7940 - loss: 0.8337 - val_accuracy: 0.7600 - val_loss: 0.5311 Epoch 87/150 350/350 - 1s - 3ms/step - accuracy: 0.7919 - loss: 0.8294 - val_accuracy: 0.7583 - val_loss: 0.5584 Epoch 88/150 350/350 - 1s - 3ms/step - accuracy: 0.7950 - loss: 0.8379 - val_accuracy: 0.7611 - val_loss: 0.5545 Epoch 89/150 350/350 - 1s - 4ms/step - accuracy: 0.8017 - loss: 0.8165 - val_accuracy: 0.7635 - val_loss: 0.5670 Epoch 90/150 350/350 - 1s - 3ms/step - accuracy: 0.7982 - loss: 0.8257 - val_accuracy: 0.7590 - val_loss: 0.5656 Epoch 91/150 350/350 - 1s - 4ms/step - accuracy: 0.8010 - loss: 0.8131 - val_accuracy: 0.7498 - val_loss: 0.5794 Epoch 92/150 350/350 - 1s - 3ms/step - accuracy: 0.8033 - loss: 0.8055 - val_accuracy: 0.7646 - val_loss: 0.5382 Epoch 93/150 350/350 - 1s - 3ms/step - accuracy: 0.8041 - loss: 0.8084 - val_accuracy: 0.7597 - val_loss: 0.5461 Epoch 94/150 350/350 - 1s - 3ms/step - accuracy: 0.7976 - loss: 0.8237 - val_accuracy: 0.7576 - val_loss: 0.5560 Epoch 95/150 350/350 - 2s - 5ms/step - accuracy: 0.7996 - loss: 0.8140 - val_accuracy: 0.7593 - val_loss: 0.5835 Epoch 96/150 350/350 - 2s - 6ms/step - accuracy: 0.7981 - loss: 0.8177 - val_accuracy: 0.7618 - val_loss: 0.5695 Epoch 97/150 350/350 - 1s - 3ms/step - accuracy: 0.8006 - loss: 0.8140 - val_accuracy: 0.7505 - val_loss: 0.5694 Epoch 98/150 350/350 - 1s - 4ms/step - accuracy: 0.8030 - loss: 0.8104 - val_accuracy: 0.7590 - val_loss: 0.5793 Epoch 99/150 350/350 - 1s - 3ms/step - accuracy: 0.8005 - loss: 0.8126 - val_accuracy: 0.7396 - val_loss: 0.5615 Epoch 100/150 350/350 - 1s - 4ms/step - accuracy: 0.7961 - loss: 0.8120 - val_accuracy: 0.7523 - val_loss: 0.6031 Epoch 101/150 350/350 - 1s - 4ms/step - accuracy: 0.8009 - loss: 0.8208 - val_accuracy: 0.7674 - val_loss: 0.5497 Epoch 102/150 350/350 - 1s - 4ms/step - accuracy: 0.8063 - loss: 0.8066 - val_accuracy: 0.7537 - val_loss: 0.6053 Epoch 103/150 350/350 - 1s - 3ms/step - accuracy: 0.8023 - loss: 0.8077 - val_accuracy: 0.7590 - val_loss: 0.5558 Epoch 104/150 350/350 - 1s - 4ms/step - accuracy: 0.8040 - loss: 0.7997 - val_accuracy: 0.7632 - val_loss: 0.5773 Epoch 105/150 350/350 - 1s - 4ms/step - accuracy: 0.8057 - loss: 0.8046 - val_accuracy: 0.7713 - val_loss: 0.5363 Epoch 106/150 350/350 - 2s - 6ms/step - accuracy: 0.8040 - loss: 0.7979 - val_accuracy: 0.7702 - val_loss: 0.5485 Epoch 107/150 350/350 - 1s - 3ms/step - accuracy: 0.8018 - loss: 0.8106 - val_accuracy: 0.7737 - val_loss: 0.5307 Epoch 108/150 350/350 - 1s - 3ms/step - accuracy: 0.8011 - loss: 0.8080 - val_accuracy: 0.7597 - val_loss: 0.5808 Epoch 109/150 350/350 - 1s - 3ms/step - accuracy: 0.8036 - loss: 0.8142 - val_accuracy: 0.7618 - val_loss: 0.5572 Epoch 110/150 350/350 - 1s - 3ms/step - accuracy: 0.7970 - loss: 0.8183 - val_accuracy: 0.7618 - val_loss: 0.5389 Epoch 111/150 350/350 - 1s - 3ms/step - accuracy: 0.8014 - loss: 0.8082 - val_accuracy: 0.7576 - val_loss: 0.5558 Epoch 112/150 350/350 - 1s - 4ms/step - accuracy: 0.8041 - loss: 0.8149 - val_accuracy: 0.7498 - val_loss: 0.5920 Epoch 113/150 350/350 - 1s - 3ms/step - accuracy: 0.7964 - loss: 0.8151 - val_accuracy: 0.7579 - val_loss: 0.6024 Epoch 114/150 350/350 - 1s - 2ms/step - accuracy: 0.8047 - loss: 0.8007 - val_accuracy: 0.7509 - val_loss: 0.5671 Epoch 115/150 350/350 - 1s - 4ms/step - accuracy: 0.8017 - loss: 0.8063 - val_accuracy: 0.7604 - val_loss: 0.5598 Epoch 116/150 350/350 - 1s - 4ms/step - accuracy: 0.8033 - loss: 0.8063 - val_accuracy: 0.7576 - val_loss: 0.5347 Epoch 117/150 350/350 - 2s - 6ms/step - accuracy: 0.7970 - loss: 0.8130 - val_accuracy: 0.7649 - val_loss: 0.5543 Epoch 118/150 350/350 - 1s - 3ms/step - accuracy: 0.8069 - loss: 0.7919 - val_accuracy: 0.7642 - val_loss: 0.5681 Epoch 119/150 350/350 - 1s - 3ms/step - accuracy: 0.8062 - loss: 0.7969 - val_accuracy: 0.7572 - val_loss: 0.5795 Epoch 120/150 350/350 - 1s - 4ms/step - accuracy: 0.8020 - loss: 0.8025 - val_accuracy: 0.7597 - val_loss: 0.5582 Epoch 121/150 350/350 - 1s - 4ms/step - accuracy: 0.8021 - loss: 0.8051 - val_accuracy: 0.7695 - val_loss: 0.5359 Epoch 122/150 350/350 - 1s - 4ms/step - accuracy: 0.8077 - loss: 0.7991 - val_accuracy: 0.7583 - val_loss: 0.5762 Epoch 123/150 350/350 - 1s - 3ms/step - accuracy: 0.8031 - loss: 0.8011 - val_accuracy: 0.7642 - val_loss: 0.5721 Epoch 124/150 350/350 - 1s - 3ms/step - accuracy: 0.8075 - loss: 0.7814 - val_accuracy: 0.7699 - val_loss: 0.5805 Epoch 125/150 350/350 - 1s - 3ms/step - accuracy: 0.8038 - loss: 0.8089 - val_accuracy: 0.7569 - val_loss: 0.5424 Epoch 126/150 350/350 - 2s - 5ms/step - accuracy: 0.8088 - loss: 0.8040 - val_accuracy: 0.7544 - val_loss: 0.5339 Epoch 127/150 350/350 - 1s - 4ms/step - accuracy: 0.8007 - loss: 0.8106 - val_accuracy: 0.7691 - val_loss: 0.5540 Epoch 128/150 350/350 - 2s - 6ms/step - accuracy: 0.8052 - loss: 0.8056 - val_accuracy: 0.7677 - val_loss: 0.5484 Epoch 129/150 350/350 - 1s - 3ms/step - accuracy: 0.8008 - loss: 0.8134 - val_accuracy: 0.7625 - val_loss: 0.5535 Epoch 130/150 350/350 - 1s - 4ms/step - accuracy: 0.8031 - loss: 0.8179 - val_accuracy: 0.7558 - val_loss: 0.5665 Epoch 131/150 350/350 - 1s - 2ms/step - accuracy: 0.8024 - loss: 0.8109 - val_accuracy: 0.7628 - val_loss: 0.5565 Epoch 132/150 350/350 - 1s - 3ms/step - accuracy: 0.8057 - loss: 0.7915 - val_accuracy: 0.7586 - val_loss: 0.5498 Epoch 133/150 350/350 - 1s - 4ms/step - accuracy: 0.8046 - loss: 0.8001 - val_accuracy: 0.7646 - val_loss: 0.5489 Epoch 134/150 350/350 - 1s - 3ms/step - accuracy: 0.8013 - loss: 0.8218 - val_accuracy: 0.7600 - val_loss: 0.5419 Epoch 135/150 350/350 - 1s - 3ms/step - accuracy: 0.8037 - loss: 0.8007 - val_accuracy: 0.7670 - val_loss: 0.5369 Epoch 136/150 350/350 - 1s - 3ms/step - accuracy: 0.8042 - loss: 0.8052 - val_accuracy: 0.7551 - val_loss: 0.5269 Epoch 137/150 350/350 - 2s - 6ms/step - accuracy: 0.8017 - loss: 0.8194 - val_accuracy: 0.7579 - val_loss: 0.5773 Epoch 138/150 350/350 - 2s - 5ms/step - accuracy: 0.8021 - loss: 0.8025 - val_accuracy: 0.7663 - val_loss: 0.5280 Epoch 139/150 350/350 - 2s - 6ms/step - accuracy: 0.8058 - loss: 0.8004 - val_accuracy: 0.7558 - val_loss: 0.5561 Epoch 140/150 350/350 - 2s - 4ms/step - accuracy: 0.8004 - loss: 0.8040 - val_accuracy: 0.7561 - val_loss: 0.5504 Epoch 141/150 350/350 - 2s - 4ms/step - accuracy: 0.8044 - loss: 0.7976 - val_accuracy: 0.7656 - val_loss: 0.5216 Epoch 142/150 350/350 - 2s - 6ms/step - accuracy: 0.8064 - loss: 0.7967 - val_accuracy: 0.7635 - val_loss: 0.5683 Epoch 143/150 350/350 - 2s - 6ms/step - accuracy: 0.7992 - loss: 0.8142 - val_accuracy: 0.7618 - val_loss: 0.5654 Epoch 144/150 350/350 - 5s - 13ms/step - accuracy: 0.8038 - loss: 0.8089 - val_accuracy: 0.7660 - val_loss: 0.5386 Epoch 145/150 350/350 - 1s - 4ms/step - accuracy: 0.8011 - loss: 0.8131 - val_accuracy: 0.7639 - val_loss: 0.5464 Epoch 146/150 350/350 - 2s - 7ms/step - accuracy: 0.8039 - loss: 0.7990 - val_accuracy: 0.7649 - val_loss: 0.5429 Epoch 147/150 350/350 - 2s - 6ms/step - accuracy: 0.8046 - loss: 0.7982 - val_accuracy: 0.7681 - val_loss: 0.5429 Epoch 148/150 350/350 - 1s - 4ms/step - accuracy: 0.8114 - loss: 0.7913 - val_accuracy: 0.7695 - val_loss: 0.5401 Epoch 149/150 350/350 - 1s - 4ms/step - accuracy: 0.8025 - loss: 0.7988 - val_accuracy: 0.7561 - val_loss: 0.5880 Epoch 150/150 350/350 - 1s - 3ms/step - accuracy: 0.8054 - loss: 0.7965 - val_accuracy: 0.7477 - val_loss: 0.5484
#plot accuracy curve
plot(history18,'accuracy')
#plot loss curve
plot(history18,'loss')
#model comparision
results.loc[18] = [2,[64,32],['relu','tanh'],150,32,"adam",['0.01', "-"],'xavier',["dropout:0.2","dropout:0.1"],history18.history["loss"][-1],history18.history["val_loss"][-1],history18.history["accuracy"][-1],history18.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
| 10 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2] | 0.892885 | 0.456302 | 0.804857 | 0.776111 | -7129.34 |
| 11 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2, BatchNormalization] | 0.808190 | 0.446884 | 0.810143 | 0.791111 | -7129.34 |
| 12 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | - | 0.571309 | 0.524371 | 0.866714 | 0.786111 | -7129.34 |
| 14 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [dropout:0.5, dropout:0.5] | 0.905916 | 0.450181 | 0.806286 | 0.790556 | -7129.34 |
| 15 | 2 | [64, 32] | [tanh, relu] | 150 | 64 | adam | [0.01, -] | xavier | [dropout-0.2, dropout-0.1] | 0.812251 | 0.474700 | 0.831714 | 0.803889 | -7129.34 |
| 16 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [SMOTE] | 0.181068 | 0.819016 | 0.932826 | 0.744554 | -7129.34 |
| 17 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | SGD | [-, -] | xavier | [SMOTE] | 0.441068 | 0.420571 | 0.793023 | 0.796557 | -7129.34 |
| 18 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [0.01, -] | xavier | [dropout:0.2, dropout:0.1] | 0.796456 | 0.548435 | 0.805367 | 0.747716 | -7129.34 |
#model performance for training data
model_performance_classification(model18, X_train_over, y_train_over)
350/350 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.806798 | 0.806798 | 0.85225 | 0.800358 |
#Model performance for validation
model_performance_classification(model18, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.640556 | 0.640556 | 0.819554 | 0.673358 |
Observations
- With this model,there is no any improvement in performance.
- Increasing learining rate and adding dropout layer it mitigate the issue of overfitting that we faced in model 16.
Model 18 with He weight initializer
#clear keras session
tf.keras.backend.clear_session()
#define model with He weight initialization technique
from tensorflow.keras import initializers
model19 = Sequential()
model19.add(Dense(64,activation="relu",kernel_initializer=initializers.HeNormal(),input_dim=X_train.shape[1]))
model19.add(Dense(32,activation="relu",kernel_initializer=initializers.HeUniform()))
model19.add(Dense(1,activation="sigmoid"))
#model summary
model19.summary()
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━┩ │ dense (Dense) │ (None, 64) │ 768 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_1 (Dense) │ (None, 32) │ 2,080 │ ├──────────────────────────────────────┼─────────────────────────────┼─────────────────┤ │ dense_2 (Dense) │ (None, 1) │ 33 │ └──────────────────────────────────────┴─────────────────────────────┴─────────────────┘
Total params: 2,881 (11.25 KB)
Trainable params: 2,881 (11.25 KB)
Non-trainable params: 0 (0.00 B)
#model compile
optimizer = tf.keras.optimizers.Adam() # defining Adam as the optimizer to be used
model19.compile(loss='binary_crossentropy', optimizer=optimizer,metrics=['accuracy'])
#Start Training model
start = time.time()
history19 = model19.fit(X_train_over, y_train_over, validation_data=(X_val,y_val),epochs=150,class_weight=cw_dict, verbose=2,batch_size = 32)
Epoch 1/150 350/350 - 2s - 7ms/step - accuracy: 0.6008 - loss: 1.3902 - val_accuracy: 0.4372 - val_loss: 1.0401 Epoch 2/150 350/350 - 1s - 2ms/step - accuracy: 0.6614 - loss: 1.2039 - val_accuracy: 0.4983 - val_loss: 0.8999 Epoch 3/150 350/350 - 1s - 3ms/step - accuracy: 0.6819 - loss: 1.1270 - val_accuracy: 0.5339 - val_loss: 0.8645 Epoch 4/150 350/350 - 1s - 2ms/step - accuracy: 0.7036 - loss: 1.0688 - val_accuracy: 0.5383 - val_loss: 0.9011 Epoch 5/150 350/350 - 1s - 3ms/step - accuracy: 0.7182 - loss: 1.0320 - val_accuracy: 0.5500 - val_loss: 0.8670 Epoch 6/150 350/350 - 1s - 2ms/step - accuracy: 0.7315 - loss: 1.0004 - val_accuracy: 0.5667 - val_loss: 0.8895 Epoch 7/150 350/350 - 1s - 4ms/step - accuracy: 0.7360 - loss: 0.9776 - val_accuracy: 0.5906 - val_loss: 0.8045 Epoch 8/150 350/350 - 2s - 6ms/step - accuracy: 0.7465 - loss: 0.9594 - val_accuracy: 0.6256 - val_loss: 0.7544 Epoch 9/150 350/350 - 3s - 10ms/step - accuracy: 0.7540 - loss: 0.9323 - val_accuracy: 0.5561 - val_loss: 0.9542 Epoch 10/150 350/350 - 4s - 12ms/step - accuracy: 0.7564 - loss: 0.9228 - val_accuracy: 0.6272 - val_loss: 0.7783 Epoch 11/150 350/350 - 2s - 5ms/step - accuracy: 0.7666 - loss: 0.9027 - val_accuracy: 0.6161 - val_loss: 0.8015 Epoch 12/150 350/350 - 2s - 5ms/step - accuracy: 0.7680 - loss: 0.8909 - val_accuracy: 0.6467 - val_loss: 0.7359 Epoch 13/150 350/350 - 2s - 5ms/step - accuracy: 0.7735 - loss: 0.8766 - val_accuracy: 0.6094 - val_loss: 0.8062 Epoch 14/150 350/350 - 4s - 13ms/step - accuracy: 0.7768 - loss: 0.8646 - val_accuracy: 0.6572 - val_loss: 0.7055 Epoch 15/150 350/350 - 3s - 9ms/step - accuracy: 0.7770 - loss: 0.8549 - val_accuracy: 0.6006 - val_loss: 0.8678 Epoch 16/150 350/350 - 2s - 7ms/step - accuracy: 0.7877 - loss: 0.8367 - val_accuracy: 0.6583 - val_loss: 0.7052 Epoch 17/150 350/350 - 3s - 7ms/step - accuracy: 0.7877 - loss: 0.8309 - val_accuracy: 0.6694 - val_loss: 0.6984 Epoch 18/150 350/350 - 3s - 10ms/step - accuracy: 0.7962 - loss: 0.8178 - val_accuracy: 0.6300 - val_loss: 0.8173 Epoch 19/150 350/350 - 3s - 10ms/step - accuracy: 0.7966 - loss: 0.8109 - val_accuracy: 0.6883 - val_loss: 0.6662 Epoch 20/150 350/350 - 4s - 10ms/step - accuracy: 0.7987 - loss: 0.8042 - val_accuracy: 0.6733 - val_loss: 0.6879 Epoch 21/150 350/350 - 3s - 8ms/step - accuracy: 0.8032 - loss: 0.7905 - val_accuracy: 0.7128 - val_loss: 0.6219 Epoch 22/150 350/350 - 3s - 8ms/step - accuracy: 0.8070 - loss: 0.7825 - val_accuracy: 0.6717 - val_loss: 0.6902 Epoch 23/150 350/350 - 3s - 10ms/step - accuracy: 0.8068 - loss: 0.7745 - val_accuracy: 0.6083 - val_loss: 0.8376 Epoch 24/150 350/350 - 3s - 9ms/step - accuracy: 0.8064 - loss: 0.7720 - val_accuracy: 0.6872 - val_loss: 0.6741 Epoch 25/150 350/350 - 1s - 3ms/step - accuracy: 0.8124 - loss: 0.7645 - val_accuracy: 0.6811 - val_loss: 0.7228 Epoch 26/150 350/350 - 1s - 4ms/step - accuracy: 0.8165 - loss: 0.7529 - val_accuracy: 0.6667 - val_loss: 0.7271 Epoch 27/150 350/350 - 1s - 4ms/step - accuracy: 0.8168 - loss: 0.7452 - val_accuracy: 0.7106 - val_loss: 0.6208 Epoch 28/150 350/350 - 1s - 4ms/step - accuracy: 0.8181 - loss: 0.7472 - val_accuracy: 0.7061 - val_loss: 0.6520 Epoch 29/150 350/350 - 3s - 7ms/step - accuracy: 0.8254 - loss: 0.7323 - val_accuracy: 0.6578 - val_loss: 0.7637 Epoch 30/150 350/350 - 4s - 10ms/step - accuracy: 0.8211 - loss: 0.7290 - val_accuracy: 0.6661 - val_loss: 0.7608 Epoch 31/150 350/350 - 2s - 5ms/step - accuracy: 0.8256 - loss: 0.7229 - val_accuracy: 0.7194 - val_loss: 0.6133 Epoch 32/150 350/350 - 2s - 7ms/step - accuracy: 0.8288 - loss: 0.7171 - val_accuracy: 0.6694 - val_loss: 0.7185 Epoch 33/150 350/350 - 2s - 5ms/step - accuracy: 0.8291 - loss: 0.7058 - val_accuracy: 0.7067 - val_loss: 0.6456 Epoch 34/150 350/350 - 1s - 3ms/step - accuracy: 0.8307 - loss: 0.7067 - val_accuracy: 0.6894 - val_loss: 0.6722 Epoch 35/150 350/350 - 1s - 4ms/step - accuracy: 0.8359 - loss: 0.6991 - val_accuracy: 0.6694 - val_loss: 0.7704 Epoch 36/150 350/350 - 1s - 3ms/step - accuracy: 0.8353 - loss: 0.6913 - val_accuracy: 0.7000 - val_loss: 0.6858 Epoch 37/150 350/350 - 2s - 5ms/step - accuracy: 0.8366 - loss: 0.6913 - val_accuracy: 0.6633 - val_loss: 0.7609 Epoch 38/150 350/350 - 4s - 11ms/step - accuracy: 0.8383 - loss: 0.6878 - val_accuracy: 0.6694 - val_loss: 0.7751 Epoch 39/150 350/350 - 2s - 7ms/step - accuracy: 0.8400 - loss: 0.6825 - val_accuracy: 0.6806 - val_loss: 0.7306 Epoch 40/150 350/350 - 1s - 4ms/step - accuracy: 0.8396 - loss: 0.6792 - val_accuracy: 0.7211 - val_loss: 0.6461 Epoch 41/150 350/350 - 2s - 6ms/step - accuracy: 0.8410 - loss: 0.6711 - val_accuracy: 0.7333 - val_loss: 0.6210 Epoch 42/150 350/350 - 1s - 4ms/step - accuracy: 0.8451 - loss: 0.6664 - val_accuracy: 0.7256 - val_loss: 0.6337 Epoch 43/150 350/350 - 1s - 4ms/step - accuracy: 0.8448 - loss: 0.6615 - val_accuracy: 0.6583 - val_loss: 0.7772 Epoch 44/150 350/350 - 1s - 4ms/step - accuracy: 0.8487 - loss: 0.6547 - val_accuracy: 0.6800 - val_loss: 0.7246 Epoch 45/150 350/350 - 2s - 4ms/step - accuracy: 0.8487 - loss: 0.6507 - val_accuracy: 0.6500 - val_loss: 0.8078 Epoch 46/150 350/350 - 1s - 4ms/step - accuracy: 0.8496 - loss: 0.6498 - val_accuracy: 0.7150 - val_loss: 0.6757 Epoch 47/150 350/350 - 1s - 3ms/step - accuracy: 0.8500 - loss: 0.6461 - val_accuracy: 0.6939 - val_loss: 0.7066 Epoch 48/150 350/350 - 1s - 3ms/step - accuracy: 0.8518 - loss: 0.6392 - val_accuracy: 0.7178 - val_loss: 0.6562 Epoch 49/150 350/350 - 1s - 4ms/step - accuracy: 0.8519 - loss: 0.6417 - val_accuracy: 0.7189 - val_loss: 0.6552 Epoch 50/150 350/350 - 1s - 4ms/step - accuracy: 0.8564 - loss: 0.6328 - val_accuracy: 0.6944 - val_loss: 0.7244 Epoch 51/150 350/350 - 1s - 4ms/step - accuracy: 0.8567 - loss: 0.6288 - val_accuracy: 0.7167 - val_loss: 0.6668 Epoch 52/150 350/350 - 1s - 2ms/step - accuracy: 0.8557 - loss: 0.6258 - val_accuracy: 0.6933 - val_loss: 0.7505 Epoch 53/150 350/350 - 1s - 3ms/step - accuracy: 0.8546 - loss: 0.6243 - val_accuracy: 0.7017 - val_loss: 0.6917 Epoch 54/150 350/350 - 1s - 4ms/step - accuracy: 0.8597 - loss: 0.6123 - val_accuracy: 0.7200 - val_loss: 0.6978 Epoch 55/150 350/350 - 1s - 2ms/step - accuracy: 0.8608 - loss: 0.6134 - val_accuracy: 0.7144 - val_loss: 0.6865 Epoch 56/150 350/350 - 1s - 3ms/step - accuracy: 0.8590 - loss: 0.6113 - val_accuracy: 0.7278 - val_loss: 0.6796 Epoch 57/150 350/350 - 1s - 4ms/step - accuracy: 0.8631 - loss: 0.6030 - val_accuracy: 0.7267 - val_loss: 0.6422 Epoch 58/150 350/350 - 2s - 6ms/step - accuracy: 0.8631 - loss: 0.6045 - val_accuracy: 0.6894 - val_loss: 0.7940 Epoch 59/150 350/350 - 1s - 2ms/step - accuracy: 0.8611 - loss: 0.6026 - val_accuracy: 0.7589 - val_loss: 0.6098 Epoch 60/150 350/350 - 1s - 4ms/step - accuracy: 0.8656 - loss: 0.5953 - val_accuracy: 0.7478 - val_loss: 0.6256 Epoch 61/150 350/350 - 1s - 4ms/step - accuracy: 0.8660 - loss: 0.5943 - val_accuracy: 0.7056 - val_loss: 0.7065 Epoch 62/150 350/350 - 1s - 4ms/step - accuracy: 0.8646 - loss: 0.5900 - val_accuracy: 0.7261 - val_loss: 0.6715 Epoch 63/150 350/350 - 1s - 4ms/step - accuracy: 0.8648 - loss: 0.5870 - val_accuracy: 0.7250 - val_loss: 0.6713 Epoch 64/150 350/350 - 1s - 4ms/step - accuracy: 0.8673 - loss: 0.5870 - val_accuracy: 0.7372 - val_loss: 0.6412 Epoch 65/150 350/350 - 1s - 3ms/step - accuracy: 0.8689 - loss: 0.5813 - val_accuracy: 0.7633 - val_loss: 0.6063 Epoch 66/150 350/350 - 2s - 5ms/step - accuracy: 0.8682 - loss: 0.5817 - val_accuracy: 0.7061 - val_loss: 0.7373 Epoch 67/150 350/350 - 1s - 4ms/step - accuracy: 0.8719 - loss: 0.5742 - val_accuracy: 0.7183 - val_loss: 0.7084 Epoch 68/150 350/350 - 1s - 4ms/step - accuracy: 0.8708 - loss: 0.5758 - val_accuracy: 0.7450 - val_loss: 0.6438 Epoch 69/150 350/350 - 1s - 2ms/step - accuracy: 0.8720 - loss: 0.5697 - val_accuracy: 0.7606 - val_loss: 0.5965 Epoch 70/150 350/350 - 1s - 2ms/step - accuracy: 0.8731 - loss: 0.5667 - val_accuracy: 0.7122 - val_loss: 0.7389 Epoch 71/150 350/350 - 1s - 3ms/step - accuracy: 0.8699 - loss: 0.5652 - val_accuracy: 0.7417 - val_loss: 0.6383 Epoch 72/150 350/350 - 1s - 2ms/step - accuracy: 0.8716 - loss: 0.5641 - val_accuracy: 0.7367 - val_loss: 0.6732 Epoch 73/150 350/350 - 1s - 4ms/step - accuracy: 0.8723 - loss: 0.5626 - val_accuracy: 0.7622 - val_loss: 0.6103 Epoch 74/150 350/350 - 1s - 2ms/step - accuracy: 0.8741 - loss: 0.5586 - val_accuracy: 0.7389 - val_loss: 0.6588 Epoch 75/150 350/350 - 1s - 4ms/step - accuracy: 0.8756 - loss: 0.5595 - val_accuracy: 0.7339 - val_loss: 0.6905 Epoch 76/150 350/350 - 1s - 2ms/step - accuracy: 0.8745 - loss: 0.5603 - val_accuracy: 0.7167 - val_loss: 0.7457 Epoch 77/150 350/350 - 1s - 3ms/step - accuracy: 0.8763 - loss: 0.5524 - val_accuracy: 0.7356 - val_loss: 0.6840 Epoch 78/150 350/350 - 2s - 5ms/step - accuracy: 0.8801 - loss: 0.5468 - val_accuracy: 0.7306 - val_loss: 0.6940 Epoch 79/150 350/350 - 1s - 4ms/step - accuracy: 0.8768 - loss: 0.5474 - val_accuracy: 0.7333 - val_loss: 0.6930 Epoch 80/150 350/350 - 2s - 4ms/step - accuracy: 0.8785 - loss: 0.5418 - val_accuracy: 0.7144 - val_loss: 0.7703 Epoch 81/150 350/350 - 2s - 6ms/step - accuracy: 0.8802 - loss: 0.5406 - val_accuracy: 0.7444 - val_loss: 0.6705 Epoch 82/150 350/350 - 1s - 3ms/step - accuracy: 0.8758 - loss: 0.5470 - val_accuracy: 0.7194 - val_loss: 0.7284 Epoch 83/150 350/350 - 1s - 2ms/step - accuracy: 0.8802 - loss: 0.5397 - val_accuracy: 0.7422 - val_loss: 0.6782 Epoch 84/150 350/350 - 1s - 2ms/step - accuracy: 0.8846 - loss: 0.5309 - val_accuracy: 0.7139 - val_loss: 0.7574 Epoch 85/150 350/350 - 1s - 4ms/step - accuracy: 0.8809 - loss: 0.5412 - val_accuracy: 0.7544 - val_loss: 0.6305 Epoch 86/150 350/350 - 1s - 2ms/step - accuracy: 0.8832 - loss: 0.5311 - val_accuracy: 0.7011 - val_loss: 0.7946 Epoch 87/150 350/350 - 1s - 2ms/step - accuracy: 0.8795 - loss: 0.5314 - val_accuracy: 0.7422 - val_loss: 0.6900 Epoch 88/150 350/350 - 1s - 2ms/step - accuracy: 0.8821 - loss: 0.5286 - val_accuracy: 0.7656 - val_loss: 0.6013 Epoch 89/150 350/350 - 1s - 4ms/step - accuracy: 0.8846 - loss: 0.5252 - val_accuracy: 0.7022 - val_loss: 0.8074 Epoch 90/150 350/350 - 1s - 4ms/step - accuracy: 0.8810 - loss: 0.5327 - val_accuracy: 0.7006 - val_loss: 0.8286 Epoch 91/150 350/350 - 2s - 6ms/step - accuracy: 0.8849 - loss: 0.5260 - val_accuracy: 0.7478 - val_loss: 0.6701 Epoch 92/150 350/350 - 1s - 3ms/step - accuracy: 0.8868 - loss: 0.5208 - val_accuracy: 0.7139 - val_loss: 0.7886 Epoch 93/150 350/350 - 1s - 4ms/step - accuracy: 0.8865 - loss: 0.5152 - val_accuracy: 0.7450 - val_loss: 0.6993 Epoch 94/150 350/350 - 2s - 5ms/step - accuracy: 0.8846 - loss: 0.5225 - val_accuracy: 0.7061 - val_loss: 0.7862 Epoch 95/150 350/350 - 1s - 3ms/step - accuracy: 0.8838 - loss: 0.5143 - val_accuracy: 0.7728 - val_loss: 0.6082 Epoch 96/150 350/350 - 1s - 4ms/step - accuracy: 0.8855 - loss: 0.5147 - val_accuracy: 0.7061 - val_loss: 0.8072 Epoch 97/150 350/350 - 1s - 3ms/step - accuracy: 0.8851 - loss: 0.5146 - val_accuracy: 0.7233 - val_loss: 0.7554 Epoch 98/150 350/350 - 1s - 2ms/step - accuracy: 0.8861 - loss: 0.5151 - val_accuracy: 0.7417 - val_loss: 0.7011 Epoch 99/150 350/350 - 1s - 2ms/step - accuracy: 0.8893 - loss: 0.5079 - val_accuracy: 0.7044 - val_loss: 0.8380 Epoch 100/150 350/350 - 2s - 5ms/step - accuracy: 0.8866 - loss: 0.5102 - val_accuracy: 0.7600 - val_loss: 0.6519 Epoch 101/150 350/350 - 1s - 4ms/step - accuracy: 0.8899 - loss: 0.5057 - val_accuracy: 0.6922 - val_loss: 0.8663 Epoch 102/150 350/350 - 2s - 6ms/step - accuracy: 0.8875 - loss: 0.5086 - val_accuracy: 0.7494 - val_loss: 0.6896 Epoch 103/150 350/350 - 1s - 3ms/step - accuracy: 0.8883 - loss: 0.5015 - val_accuracy: 0.7306 - val_loss: 0.7848 Epoch 104/150 350/350 - 1s - 2ms/step - accuracy: 0.8918 - loss: 0.5011 - val_accuracy: 0.7622 - val_loss: 0.6702 Epoch 105/150 350/350 - 1s - 3ms/step - accuracy: 0.8874 - loss: 0.5023 - val_accuracy: 0.7456 - val_loss: 0.7153 Epoch 106/150 350/350 - 1s - 4ms/step - accuracy: 0.8919 - loss: 0.4965 - val_accuracy: 0.7283 - val_loss: 0.7887 Epoch 107/150 350/350 - 1s - 2ms/step - accuracy: 0.8923 - loss: 0.4943 - val_accuracy: 0.7256 - val_loss: 0.7511 Epoch 108/150 350/350 - 1s - 2ms/step - accuracy: 0.8928 - loss: 0.4851 - val_accuracy: 0.7533 - val_loss: 0.7014 Epoch 109/150 350/350 - 1s - 2ms/step - accuracy: 0.8919 - loss: 0.4895 - val_accuracy: 0.7383 - val_loss: 0.7544 Epoch 110/150 350/350 - 1s - 4ms/step - accuracy: 0.8917 - loss: 0.4930 - val_accuracy: 0.7322 - val_loss: 0.7723 Epoch 111/150 350/350 - 2s - 4ms/step - accuracy: 0.8941 - loss: 0.4841 - val_accuracy: 0.7456 - val_loss: 0.7151 Epoch 112/150 350/350 - 1s - 4ms/step - accuracy: 0.8947 - loss: 0.4866 - val_accuracy: 0.7550 - val_loss: 0.6901 Epoch 113/150 350/350 - 1s - 4ms/step - accuracy: 0.8933 - loss: 0.4915 - val_accuracy: 0.7522 - val_loss: 0.6960 Epoch 114/150 350/350 - 1s - 2ms/step - accuracy: 0.8936 - loss: 0.4828 - val_accuracy: 0.7422 - val_loss: 0.7409 Epoch 115/150 350/350 - 1s - 4ms/step - accuracy: 0.8940 - loss: 0.4854 - val_accuracy: 0.7467 - val_loss: 0.7221 Epoch 116/150 350/350 - 1s - 2ms/step - accuracy: 0.8922 - loss: 0.4872 - val_accuracy: 0.7167 - val_loss: 0.8352 Epoch 117/150 350/350 - 1s - 3ms/step - accuracy: 0.8947 - loss: 0.4801 - val_accuracy: 0.7622 - val_loss: 0.6961 Epoch 118/150 350/350 - 1s - 2ms/step - accuracy: 0.8951 - loss: 0.4703 - val_accuracy: 0.7311 - val_loss: 0.7586 Epoch 119/150 350/350 - 1s - 4ms/step - accuracy: 0.8970 - loss: 0.4746 - val_accuracy: 0.7600 - val_loss: 0.6793 Epoch 120/150 350/350 - 1s - 4ms/step - accuracy: 0.8962 - loss: 0.4738 - val_accuracy: 0.7556 - val_loss: 0.7039 Epoch 121/150 350/350 - 1s - 2ms/step - accuracy: 0.8977 - loss: 0.4697 - val_accuracy: 0.7278 - val_loss: 0.8232 Epoch 122/150 350/350 - 1s - 4ms/step - accuracy: 0.8959 - loss: 0.4689 - val_accuracy: 0.7322 - val_loss: 0.7667 Epoch 123/150 350/350 - 2s - 5ms/step - accuracy: 0.8985 - loss: 0.4699 - val_accuracy: 0.7122 - val_loss: 0.8486 Epoch 124/150 350/350 - 2s - 6ms/step - accuracy: 0.8960 - loss: 0.4721 - val_accuracy: 0.7311 - val_loss: 0.7834 Epoch 125/150 350/350 - 1s - 4ms/step - accuracy: 0.8989 - loss: 0.4660 - val_accuracy: 0.7756 - val_loss: 0.6642 Epoch 126/150 350/350 - 1s - 4ms/step - accuracy: 0.8998 - loss: 0.4643 - val_accuracy: 0.7311 - val_loss: 0.7599 Epoch 127/150 350/350 - 1s - 4ms/step - accuracy: 0.8996 - loss: 0.4659 - val_accuracy: 0.7533 - val_loss: 0.7337 Epoch 128/150 350/350 - 1s - 4ms/step - accuracy: 0.8966 - loss: 0.4672 - val_accuracy: 0.7550 - val_loss: 0.7291 Epoch 129/150 350/350 - 1s - 4ms/step - accuracy: 0.9004 - loss: 0.4621 - val_accuracy: 0.7239 - val_loss: 0.8285 Epoch 130/150 350/350 - 1s - 2ms/step - accuracy: 0.9012 - loss: 0.4533 - val_accuracy: 0.7383 - val_loss: 0.7684 Epoch 131/150 350/350 - 1s - 4ms/step - accuracy: 0.8995 - loss: 0.4608 - val_accuracy: 0.7317 - val_loss: 0.7965 Epoch 132/150 350/350 - 2s - 5ms/step - accuracy: 0.9012 - loss: 0.4529 - val_accuracy: 0.7544 - val_loss: 0.7456 Epoch 133/150 350/350 - 1s - 4ms/step - accuracy: 0.8990 - loss: 0.4535 - val_accuracy: 0.7422 - val_loss: 0.7357 Epoch 134/150 350/350 - 2s - 6ms/step - accuracy: 0.8993 - loss: 0.4571 - val_accuracy: 0.7239 - val_loss: 0.8273 Epoch 135/150 350/350 - 1s - 4ms/step - accuracy: 0.9024 - loss: 0.4483 - val_accuracy: 0.7361 - val_loss: 0.7923 Epoch 136/150 350/350 - 1s - 2ms/step - accuracy: 0.9023 - loss: 0.4513 - val_accuracy: 0.6894 - val_loss: 0.9636 Epoch 137/150 350/350 - 1s - 4ms/step - accuracy: 0.8999 - loss: 0.4563 - val_accuracy: 0.7433 - val_loss: 0.7388 Epoch 138/150 350/350 - 1s - 4ms/step - accuracy: 0.9030 - loss: 0.4510 - val_accuracy: 0.7378 - val_loss: 0.8057 Epoch 139/150 350/350 - 1s - 2ms/step - accuracy: 0.9043 - loss: 0.4471 - val_accuracy: 0.7561 - val_loss: 0.7209 Epoch 140/150 350/350 - 1s - 4ms/step - accuracy: 0.9013 - loss: 0.4464 - val_accuracy: 0.7628 - val_loss: 0.7358 Epoch 141/150 350/350 - 1s - 2ms/step - accuracy: 0.9033 - loss: 0.4481 - val_accuracy: 0.7556 - val_loss: 0.7471 Epoch 142/150 350/350 - 1s - 2ms/step - accuracy: 0.9045 - loss: 0.4406 - val_accuracy: 0.7172 - val_loss: 0.8934 Epoch 143/150 350/350 - 1s - 3ms/step - accuracy: 0.9020 - loss: 0.4463 - val_accuracy: 0.7361 - val_loss: 0.8309 Epoch 144/150 350/350 - 1s - 4ms/step - accuracy: 0.9049 - loss: 0.4393 - val_accuracy: 0.7422 - val_loss: 0.7738 Epoch 145/150 350/350 - 2s - 6ms/step - accuracy: 0.9040 - loss: 0.4385 - val_accuracy: 0.7478 - val_loss: 0.7739 Epoch 146/150 350/350 - 1s - 2ms/step - accuracy: 0.9055 - loss: 0.4394 - val_accuracy: 0.7739 - val_loss: 0.7023 Epoch 147/150 350/350 - 1s - 2ms/step - accuracy: 0.9059 - loss: 0.4381 - val_accuracy: 0.7522 - val_loss: 0.7679 Epoch 148/150 350/350 - 1s - 4ms/step - accuracy: 0.9069 - loss: 0.4326 - val_accuracy: 0.7239 - val_loss: 0.8780 Epoch 149/150 350/350 - 1s - 4ms/step - accuracy: 0.9077 - loss: 0.4370 - val_accuracy: 0.7311 - val_loss: 0.8451 Epoch 150/150 350/350 - 1s - 2ms/step - accuracy: 0.9069 - loss: 0.4352 - val_accuracy: 0.7644 - val_loss: 0.7285
#Plot Accuracy curve
plot(history19,'accuracy')
#plot Loss curve
plot(history19,'loss')
#comparing model performance
results.loc[19] = [2,[64,32],['relu','relu'],150,32,"adam",['-', "-"],['He',"He"],"-",history19.history["loss"][-1],history19.history["val_loss"][-1],history19.history["accuracy"][-1],history19.history["val_accuracy"][-1],round(end-start,2)]
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
| 10 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2] | 0.892885 | 0.456302 | 0.804857 | 0.776111 | -7129.34 |
| 11 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2, BatchNormalization] | 0.808190 | 0.446884 | 0.810143 | 0.791111 | -7129.34 |
| 12 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | - | 0.571309 | 0.524371 | 0.866714 | 0.786111 | -7129.34 |
| 14 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [dropout:0.5, dropout:0.5] | 0.905916 | 0.450181 | 0.806286 | 0.790556 | -7129.34 |
| 15 | 2 | [64, 32] | [tanh, relu] | 150 | 64 | adam | [0.01, -] | xavier | [dropout-0.2, dropout-0.1] | 0.812251 | 0.474700 | 0.831714 | 0.803889 | -7129.34 |
| 16 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [SMOTE] | 0.181068 | 0.819016 | 0.932826 | 0.744554 | -7129.34 |
| 17 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | SGD | [-, -] | xavier | [SMOTE] | 0.441068 | 0.420571 | 0.793023 | 0.796557 | -7129.34 |
| 18 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [0.01, -] | xavier | [dropout:0.2, dropout:0.1] | 0.796456 | 0.548435 | 0.805367 | 0.747716 | -7129.34 |
| 19 | 2 | [64, 32] | [relu, relu] | 150 | 32 | adam | [-, -] | [He, He] | - | 0.435168 | 0.728501 | 0.906887 | 0.764444 | -7129.34 |
#model performance with training data
model_performance_classification(model18, X_train_over, y_train_over)
350/350 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.806798 | 0.806798 | 0.85225 | 0.800358 |
#model performance with validation data
model_performance_classification(model18, X_val, y_val)
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
| Accuracy | Recall | Precision | F1 Score | |
|---|---|---|---|---|
| 0 | 0.640556 | 0.640556 | 0.819554 | 0.673358 |
Observations
- No much more improvement.Validation loss is more than training.It indicates overfitting.
Model Performance Comparison and Final Model Selection¶
Comparing Models Permance using restlts dataset and then comparing there recall values and f1_score.
results
| # hidden layers | # neurons - hidden layer | activation function - hidden layer | # epochs | batch size | optimizer | learning rate, momentum | weight initializer | regularization | train loss | validation loss | train accuracy | validation accuracy | time (secs) | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2 | [64, 32] | [relu, relu] | 100 | 7000 | sgd | [-, -] | xavier | - | 1.251845 | 0.608909 | 0.724714 | 0.726667 | -7129.34 |
| 1 | 2 | [64, 32] | [relu, relu] | 100 | 64 | sgd | [-, -] | xavier | - | 0.841307 | 0.463114 | 0.811714 | 0.774444 | -7129.34 |
| 2 | 1 | 64 | relu | 100 | 64 | sgd | [-, -] | xavier | - | 0.934649 | 0.493455 | 0.782857 | 0.756111 | -7129.34 |
| 3 | 2 | [64, 32] | [relu, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.830580 | 0.404369 | 0.812571 | 0.813889 | -7129.34 |
| 4 | 2 | [128, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.858299 | 0.444290 | 0.809143 | 0.795556 | -7129.34 |
| 5 | 2 | [64, 32] | [tanh, tanh] | 150 | 64 | sgd | [-, -] | xavier | - | 0.866516 | 0.405230 | 0.806857 | 0.817222 | -7129.34 |
| 6 | 2 | [64, 32] | [relu, tanh] | 150 | 128 | sgd | [0.001, 0.3] | xavier | - | 1.091327 | 0.538426 | 0.721571 | 0.720000 | -7129.34 |
| 7 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | sgd | [0.001, 0.9] | xavier | - | 0.769514 | 0.468413 | 0.826857 | 0.778333 | -7129.34 |
| 8 | 2 | [64, 32] | [relu, tanh] | 50 | 32 | sgd | [0.001, 0.7] | xavier | - | 0.853101 | 0.477964 | 0.810714 | 0.767778 | -7129.34 |
| 9 | 2 | [64, 32] | [relu, tanh] | 100 | 64 | sgd | [0.001, 0.4] | xavier | - | 1.085786 | 0.530878 | 0.714429 | 0.721667 | -7129.34 |
| 10 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2] | 0.892885 | 0.456302 | 0.804857 | 0.776111 | -7129.34 |
| 11 | 2 | [64, 32] | [relu, tanh] | 100 | 32 | sgd | [0.001, 0.9] | xavier | [dropout:0.2, BatchNormalization] | 0.808190 | 0.446884 | 0.810143 | 0.791111 | -7129.34 |
| 12 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | - | 0.571309 | 0.524371 | 0.866714 | 0.786111 | -7129.34 |
| 14 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [dropout:0.5, dropout:0.5] | 0.905916 | 0.450181 | 0.806286 | 0.790556 | -7129.34 |
| 15 | 2 | [64, 32] | [tanh, relu] | 150 | 64 | adam | [0.01, -] | xavier | [dropout-0.2, dropout-0.1] | 0.812251 | 0.474700 | 0.831714 | 0.803889 | -7129.34 |
| 16 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [-, -] | xavier | [SMOTE] | 0.181068 | 0.819016 | 0.932826 | 0.744554 | -7129.34 |
| 17 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | SGD | [-, -] | xavier | [SMOTE] | 0.441068 | 0.420571 | 0.793023 | 0.796557 | -7129.34 |
| 18 | 2 | [64, 32] | [relu, tanh] | 150 | 32 | adam | [0.01, -] | xavier | [dropout:0.2, dropout:0.1] | 0.796456 | 0.548435 | 0.805367 | 0.747716 | -7129.34 |
| 19 | 2 | [64, 32] | [relu, relu] | 150 | 32 | adam | [-, -] | [He, He] | - | 0.435168 | 0.728501 | 0.906887 | 0.764444 | -7129.34 |
Observations
- From the above dataframe,we can choose model 15,12,5 and 3 which have less loss and better accuracy for training and validatation data.
Now,Compare there performance with respect to Accuracy,Recall,Precision and F1_score
models_train_comp_df = pd.concat(
[model_performance_classification(model15, X_train, y_train).T,
model_performance_classification(model12, X_train, y_train).T,
model_performance_classification(model5, X_train, y_train).T,
model_performance_classification(model3, X_train, y_train).T,
],
axis=1,
)
models_train_comp_df.columns = ["Model15 With Adam with dropout", "Model12 with Adam", "Model5-Sgd[tanh,tanh]", "Model3-Sgd[relu, tanh]"]
print("Training performance comparison:")
models_train_comp_df
219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step 219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step 219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step 219/219 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step Training performance comparison:
| Model15 With Adam with dropout | Model12 with Adam | Model5-Sgd[tanh,tanh] | Model3-Sgd[relu, tanh] | |
|---|---|---|---|---|
| Accuracy | 0.853714 | 0.859571 | 0.831143 | 0.840143 |
| Recall | 0.853714 | 0.859571 | 0.831143 | 0.840143 |
| Precision | 0.882311 | 0.898801 | 0.854051 | 0.860450 |
| F1 Score | 0.861997 | 0.869130 | 0.839051 | 0.847176 |
models_val_comp_df = pd.concat(
[model_performance_classification(model15, X_val, y_val).T,
model_performance_classification(model12, X_val, y_val).T,
model_performance_classification(model5, X_val, y_val).T,
model_performance_classification(model3, X_val, y_val).T,
],
axis=1,
)
models_val_comp_df.columns = ["Model15 With Adam with dropout", "Model12 with Adam", "Model5-Sgd[tanh,tanh]", "Model3-Sgd[relu, tanh]"]
print("Validation performance comparison:")
models_val_comp_df
57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step 57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step 57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step 57/57 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step Validation performance comparison:
| Model15 With Adam with dropout | Model12 with Adam | Model5-Sgd[tanh,tanh] | Model3-Sgd[relu, tanh] | |
|---|---|---|---|---|
| Accuracy | 0.803889 | 0.786111 | 0.817222 | 0.813889 |
| Recall | 0.803889 | 0.786111 | 0.817222 | 0.813889 |
| Precision | 0.835261 | 0.825131 | 0.841289 | 0.836804 |
| F1 Score | 0.814333 | 0.798836 | 0.825580 | 0.822053 |
Observations
- From above comparision table Model5 with SGD is the winner.
- But also we will check it's performance with unseen data before giving any conclusion.
Model evaluation on the test data
Model 12
#evaluate model 12 with test data
model12.evaluate(X_test,y_test)
test_pred=model12.predict(X_test)
38/38 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.7789 - loss: 0.5422 38/38 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
from sklearn.metrics import classification_report
from sklearn.metrics import confusion_matrix
# Convert predictions to binary using a threshold (e.g., 0.5)
test_pred_binary = np.where(test_pred > 0.5, 1, 0)
# Now use the binary predictions in the classification_report
print(classification_report(y_test, test_pred_binary))
cm = confusion_matrix(y_test, test_pred_binary)
plt.figure(figsize=(8,5))
sns.heatmap(cm, annot=True, fmt='.0f',xticklabels=['Not Exited', 'Exited'], yticklabels=['Not Exited', 'Exited'])
plt.ylabel('Actual')
plt.xlabel('Predicted')
plt.show()
precision recall f1-score support
0 0.91 0.82 0.86 950
1 0.49 0.68 0.57 250
accuracy 0.79 1200
macro avg 0.70 0.75 0.72 1200
weighted avg 0.82 0.79 0.80 1200
***Model 15**
#evaluate model 14 with test data
model15.evaluate(X_test,y_test)
test_pred=model15.predict(X_test)
38/38 ━━━━━━━━━━━━━━━━━━━━ 0s 4ms/step - accuracy: 0.8160 - loss: 0.4735 38/38 ━━━━━━━━━━━━━━━━━━━━ 0s 5ms/step
# Convert predictions to binary using a threshold (e.g., 0.5)
test_pred_binary = np.where(test_pred > 0.5, 1, 0)
# Now use the binary predictions in the classification_report
print(classification_report(y_test, test_pred_binary))
cm = confusion_matrix(y_test, test_pred_binary)
plt.figure(figsize=(8,5))
sns.heatmap(cm, annot=True, fmt='.0f',xticklabels=['Not Exited', 'Exited'], yticklabels=['Not Exited', 'Exited'])
plt.ylabel('Actual')
plt.xlabel('Predicted')
plt.show()
precision recall f1-score support
0 0.91 0.85 0.88 950
1 0.54 0.69 0.61 250
accuracy 0.81 1200
macro avg 0.73 0.77 0.74 1200
weighted avg 0.84 0.81 0.82 1200
Model 5
##evaluate model 7 with test data
model5.evaluate(X_test,y_test)
test_pred=model5.predict(X_test)
38/38 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step - accuracy: 0.8175 - loss: 0.4029 38/38 ━━━━━━━━━━━━━━━━━━━━ 0s 3ms/step
# Convert predictions to binary using a threshold (e.g., 0.5)
test_pred_binary = np.where(test_pred > 0.5, 1, 0)
# Now use the binary predictions in the classification_report
print(classification_report(y_test, test_pred_binary))
cm = confusion_matrix(y_test, test_pred_binary)
plt.figure(figsize=(8,5))
sns.heatmap(cm, annot=True, fmt='.0f',xticklabels=['Not Exited', 'Exited'], yticklabels=['Not Exited', 'Exited'])
plt.ylabel('Actual')
plt.xlabel('Predicted')
plt.show()
precision recall f1-score support
0 0.92 0.86 0.89 950
1 0.56 0.70 0.62 250
accuracy 0.82 1200
macro avg 0.74 0.78 0.76 1200
weighted avg 0.84 0.82 0.83 1200
Model 3
#Evaluate model 3
model3.evaluate(X_test,y_test)
test_pred=model3.predict(X_test)
38/38 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.8221 - loss: 0.4165 38/38 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step
# Convert predictions to binary using a threshold (e.g., 0.5)
test_pred_binary = np.where(test_pred > 0.5, 1, 0)
# Now use the binary predictions in the classification_report
print(classification_report(y_test, test_pred_binary))
cm = confusion_matrix(y_test, test_pred_binary)
plt.figure(figsize=(8,5))
sns.heatmap(cm, annot=True, fmt='.0f',xticklabels=['Not Exited', 'Exited'], yticklabels=['Not Exited', 'Exited'])
plt.ylabel('Actual')
plt.xlabel('Predicted')
plt.show()
precision recall f1-score support
0 0.91 0.86 0.88 950
1 0.56 0.67 0.61 250
accuracy 0.82 1200
macro avg 0.73 0.77 0.75 1200
weighted avg 0.84 0.82 0.83 1200
Observations
- From the above results,model 5 and 3 give little bit more recall and F1_score than others models.Hence,model 5 with two hidden layer with activation function tanh and model 3 with SGD optimizer with two hidden layers [64,32] with relu and tanh activation function are the winner.
Actionable Insights and Business Recommendations¶
- Bank should consider the final model from this project to predict with a reasonable degree of accuracy whether bank customer is likely to exite or not, and this process seems to be easier and more time-efficient than other methods.
- Bank need to find out why old age customers are exited than young one.
- Retaining older customers in a bank requires a mix of personalized service, trust-building, digital adaptation, and value-driven engagement.By understanding their needs, simplifying their banking experience, and adding value, you can retain older customers and prevent churn.
- Bank should
- Loyalty Programs: Reward them for their long-term association with bonus interest rates, cashback, or special investment options.
- Referral Benefits: Encourage them to bring family members by offering family banking benefits or joint account perks.
- Personalized Offers: Send targeted offers based on their banking behavior—discounts on medical services, travel benefits, or customized investment plans.
- As we observed, German customers has high exite ratio than other locations.So that try to find out cause behind it.Start more loyalty programs, referral benefit as I mensioned early.
- Further investigate why customers who purchased more than 2 products have such a high churn rate.To attract customers to a banking product, you need a customer-centric strategy that highlights the unique benefits, ease of use, and trustworthiness of the product.
- Try to understand customers need.To attract customers, a banking product must be valuable, easy to use, and well-promoted. Combining competitive offers, digital convenience, targeted marketing, and trust-building will ensure success.
!jupyter nbconvert --to html /Users/kirtikamerkar/Downloads/NN_Project4_Bank_Churn_Prediction_Kirti_Kamerkar%20(1).ipynb
zsh:1: no matches found: /Users/kirtikamerkar/Downloads/NN_Project4_Bank_Churn_Prediction_Kirti_Kamerkar%20(1).ipynb
Power Ahead